arXiv:1803.10803v2 [math.OC] 28 Jan 2019

On the Equivalence of Inexact Proximal ALM and ADMM for a Class of Convex Composite Programming

Liang Chen∗, Xudong Li†, Defeng Sun‡, and Kim-Chuan Toh§

March 28, 2018; Revised on Jan 28, 2019

Abstract

In this paper, we show that for a class of linearly constrained convex composite optimization problems, an (inexact) symmetric Gauss-Seidel based majorized multi-block proximal alternating direction method of multipliers (ADMM) is equivalent to an inexact proximal augmented Lagrangian method (ALM). This equivalence not only provides new perspectives for understanding some ADMM-type algorithms but also supplies meaningful guidelines on implementing them to achieve better computational efficiency. Even for the two-block case, a by-product of this equivalence is the convergence of the whole sequence generated by the classic ADMM with a step-length that exceeds the conventional upper bound of (1 + √5)/2, if one part of the objective is linear. This is exactly the problem setting in which the very first convergence analysis of ADMM was conducted by Gabay and Mercier in 1976, but, even under notably stronger assumptions, only the convergence of the primal sequence was known. A collection of illustrative examples is provided to demonstrate the breadth of applications for which our results can be used. Numerical experiments on solving a large number of linear and convex quadratic semidefinite programming problems are conducted to illustrate how the theoretical results established here can lead to improvements in the corresponding practical implementations.

Keywords. Alternating direction method of multipliers; Augmented Lagrangian method; Symmetric Gauss-Seidel decomposition; Proximal term

AMS Subject Classification. 90C25; 65K05; 90C06; 49M27; 90C20

1 Introduction

Let X, Y and Z be three finite-dimensional real Hilbert spaces, each endowed with an inner product denoted by 〈·, ·〉 and its induced norm denoted by ‖ · ‖, where Y := Y1 × · · · × Ys is the Cartesian product of s finite-dimensional real Hilbert spaces Yi, i = 1, . . . , s, each endowed with the inner product, as well as the induced norm, inherited from Y. For any given y ∈ Y, we can write y = (y1; . . . ; ys) with yi ∈ Yi, ∀ i = 1, . . . , s. Here,

∗College of Mathematics and Econometrics, Hunan University, Changsha, 410082, China ([email protected]), and Department of Applied Mathematics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong ([email protected]). The research of this author was supported by the National Natural Science Foundation of China (11801158, 11871205) and the Fundamental Research Funds for the Central Universities in China.

†School of Data Science, Fudan University, Shanghai 200433, China, and Shanghai Center for Mathematical Sciences, Fudan University, Shanghai 200433, China ([email protected]). The research of this author was supported by the Fundamental Research Funds for the Central Universities in China.

‡Department of Applied Mathematics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong ([email protected]). The research of this author was supported in part by a start-up research grant from the Hong Kong Polytechnic University.

§Department of Mathematics, and Institute of Operations Research and Analytics, National University of Singapore, 10 Lower Kent Ridge Road, Singapore 119076 ([email protected]). The research of this author was supported in part by the Ministry of Education, Singapore, Academic Research Fund (R-146-000-257-112).



and throughout this paper, we use the notation (y1; . . . ; ys) to mean that the vectors y1, . . . , ys are written symbolically in a column format.

In this paper, we shall focus on the following multi-block convex composite optimization problem

min_{y∈Y, z∈Z} { p(y1) + f(y) − 〈b, z〉 | F∗y + G∗z = c },    (1.1)

where p : Y1 → (−∞, +∞] is a (possibly nonsmooth) closed proper convex function, f : Y → (−∞, +∞) is a continuously differentiable convex function whose gradient is Lipschitz continuous, b ∈ Z and c ∈ X are the given data, and F∗ and G∗ are the adjoints of the given linear mappings F : X → Y and G : X → Z, respectively. Despite the simple appearance of problem (1.1), we shall see in the next section that this model actually encompasses various important classes of convex optimization problems, both in classical core convex programming and in recently emerged models from a broad range of real-world applications. A quintessential example of problem (1.1) is the dual of the following convex composite quadratic programming problem

min_{x} { ψ(x) + (1/2)〈x, Qx〉 − 〈c, x〉 | Gx = b },    (1.2)

where ψ : X → (−∞, +∞] is a closed proper convex function, Q : X → X is a self-adjoint positive semidefinite linear operator, G : X → Z is a linear mapping, and c ∈ X and b ∈ Range(G) (i.e., b is in the range space of the linear operator G) are the given data. The dual of problem (1.2), in the minimization form, can be written as follows:

min_{y1, y2, z} { ψ∗(y1) + (1/2)〈y2, Qy2〉 − 〈b, z〉 | y1 + Qy2 − G∗z = c },    (1.3)

where ψ∗ is the Fenchel conjugate of ψ, y1 ∈ X, y2 ∈ X and z ∈ Z, so that problem (1.3) constitutes an instance of problem (1.1).

To solve problem (1.1), one of the most preferred approaches is the augmented Lagrangian method (ALM), initiated by Hestenes [23] and Powell [42], and elegantly studied for general convex optimization problems (without taking into account the multi-block structure) in the seminal work of Rockafellar [45]. Given a penalty parameter σ > 0, the augmented Lagrangian function corresponding to problem (1.1) is defined by

Lσ(y, z; x) := p(y1) + f(y) − 〈b, z〉 + 〈x, F∗y + G∗z − c〉 + (σ/2)‖F∗y + G∗z − c‖²,  ∀ (x, y, z) ∈ X × Y × Z.

Starting from a given initial multiplier x0 ∈ X, the ALM performs the following steps at the k-th iteration:

(1) compute (yk+1, zk+1) to (approximately) minimize the function Lσ(y, z;xk), and

(2) update the multipliers xk+1 := xk + τσ(F∗yk+1 + G∗zk+1 − c), where τ ∈ (0, 2) is the step-length.
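To make the two steps above concrete, the following self-contained sketch (our illustration, not code from the paper) applies the exact ALM with a step-length τ ∈ (0, 2) to a tiny equality-constrained quadratic program min_x { (1/2)〈x, Px〉 − 〈c, x〉 | Ax = b }, for which step (1) reduces to a single linear solve; the data P, A, b, c below are arbitrary illustrative choices.

```python
import numpy as np

# Toy instance: minimize (1/2)<x, Px> - <c, x> subject to Ax = b.
P = np.diag([1.0, 2.0, 3.0, 4.0])          # self-adjoint positive definite
A = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])
c = np.array([1.0, 2.0, 0.0, 1.0])
b = np.array([1.0, 0.5])

sigma, tau = 1.0, 1.5                      # penalty parameter and step-length in (0, 2)
K = P + sigma * A.T @ A                    # Hessian of the augmented Lagrangian in x
lam = np.zeros(2)                          # the multiplier (x^k in the paper's notation)
for _ in range(100):
    # Step (1): exact minimizer of L_sigma(., lam) -- here a linear solve.
    x = np.linalg.solve(K, c - A.T @ lam + sigma * A.T @ b)
    # Step (2): multiplier update with step-length tau*sigma.
    lam = lam + tau * sigma * (A @ x - b)

primal_res = np.linalg.norm(A @ x - b)             # primal feasibility
dual_res = np.linalg.norm(P @ x - c + A.T @ lam)   # stationarity of the Lagrangian
```

At convergence both residuals vanish, i.e. (x, lam) satisfies the KKT conditions; the coupling that makes step (1) expensive for (1.1) is hidden here in the single linear solve with K.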

While one would ideally solve min_{y,z} Lσ(y, z; xk) as it is, without modifying the augmented Lagrangian function, it can be expensive to minimize Lσ(y, z; xk) with respect to both y and z simultaneously, due to the coupled quadratic term in y and z. Thus, in practice, unless the ALM is converging rapidly, one would generally replace the augmented Lagrangian subproblem with an easier-to-solve surrogate by modifying the augmented Lagrangian function so as to decouple the minimization with respect to y and z. Such a modification is especially desirable during the initial phase of the ALM, when its local superlinear convergence has yet to kick in. The most obvious approach to decouple the subproblem for obtaining (yk+1, zk+1) is to add to Lσ(y, z; xk) the proximal term (σ/2)‖(y; z) − (yk; zk)‖²_Λ, where Λ := λ²I − (F; G)(F; G)∗, with λ being the largest singular value of (F; G) and I the identity operator in Y × Z. However, such a modification to the augmented Lagrangian function is generally too drastic and has the undesirable effect of significantly slowing down the convergence of the ALM [6, Section 7]. This naturally leads us to the important question of what is an appropriate proximal term to add to Lσ(y, z; xk) so that the ALM subproblem is easier to solve while, at the same time, the modification is less drastic than the obvious choice just mentioned.

We shall show in this paper that by adding an appropriately designed proximal term to Lσ(y, z; xk), we can reduce the computation of the modified ALM subproblem to sequentially updating y and z via computing

y^{k+1} ≈ arg min_y Lσ(y, z^k; x^k)   and   z^{k+1} ≈ arg min_z Lσ(y^{k+1}, z; x^k).

The reader will have observed that the resulting proximal ALM updating scheme is the same as the classic two-block ADMM (pioneered by Glowinski and Marroco [21] and Gabay and Mercier [18]) applied to problem (1.1). However, there is a crucial difference in that our convergence result holds true for the step-length τ in the range (0, 2), whereas the classic two-block ADMM only allows the step-length to be in the interval (0, (1 + √5)/2) if the convergence of the full sequence generated by the algorithm is required. It is important to note that even with the sequential minimization of y and z in the modified ALM subproblem, the minimization subproblem with respect to y can still be very difficult to solve due to the coupling of the blocks y1, . . . , ys in (1.1). One of the main contributions of this paper is to show that by majorizing the function f(y) at yk with a quadratic function, and by adding an extra proximal term derived from the block symmetric Gauss-Seidel (sGS) decomposition theorem [32] for the quadratic term associated with y, we are able to update the sub-blocks of y individually in a symmetric Gauss-Seidel fashion. A crucial implication of this result is that the (inexact) block sGS decomposition based multi-block majorized ADMM is equivalent to an inexact majorized proximal ALM. Consequently, we are able to prove the convergence of the whole sequence generated by the former even when the step-length is in the range (0, 2).
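As an illustration of the sequential y/z updates, the sketch below (our own toy construction, not from the paper) runs the classic two-block ADMM on an instance of (1.1) with s = 1, p ≡ 0, f(y) = (1/2)‖y − d‖², a linear term −〈b, z〉, and constraint y + gz = c (so F∗ is the identity and G∗ is the column g); both subproblems have closed forms. We use the conventional step-length τ = 1.6 ∈ (0, (1 + √5)/2); the result discussed above would justify any τ ∈ (0, 2) in this setting, since one part of the objective is linear.

```python
import numpy as np

# Toy instance of (1.1) with s = 1:
#   min over y in R^2, z in R:  (1/2)||y - d||^2 - b*z   s.t.  y + g*z = c.
d = np.array([1.0, 2.0])
g = np.array([1.0, -1.0])        # G* acting as a single column
cvec = np.array([0.0, 1.0])
b = 0.5

sigma, tau = 1.0, 1.6
y, z, x = np.zeros(2), 0.0, np.zeros(2)   # x is the multiplier
for _ in range(200):
    # y-update: minimize L_sigma(y, z^k; x^k) over y (closed form).
    y = (d - x + sigma * (cvec - g * z)) / (1.0 + sigma)
    # z-update: minimize L_sigma(y^{k+1}, z; x^k) over z (closed form).
    z = (b - g @ x - sigma * g @ (y - cvec)) / (sigma * (g @ g))
    # multiplier update with step-length tau.
    x = x + tau * sigma * (y + g * z - cvec)

primal_res = np.linalg.norm(y + g * z - cvec)   # feasibility
dual_res_y = np.linalg.norm(y - d + x)          # stationarity in y
dual_res_z = abs(g @ x - b)                     # stationarity in z
```

All three residuals tend to zero, so the full primal-dual sequence converges to the KKT point of this toy problem.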

In this paper, we shall not delve into the vast literature on the ALM and the ADMM, their variants, and their relationships to the proximal point method and operator splitting methods; the references are simply too abundant for us to list even a few of them here. Thus, we shall only refer to those that are most relevant to our work. Here we should mention that many attempts have been made in recent years to design convergent multi-block ADMM-type algorithms that can outperform the directly extended multi-block (proximal) ADMM numerically. While the latter is not guaranteed to converge even under the strong assumption that f ≡ 0, paradoxically its practical numerical performance is often better than that of many convergent variants developed in the past; see, for example, [48]. Against this backdrop, we should mention that the ADMM-type algorithms progressively designed in [48, 30, 6] not only come with convergence guarantees but have also been demonstrated to have numerical performance superior to that of the directly extended ADMM, at least for a large number of convex conic programming problems. More recently, those algorithms have found applications in various areas [1, 2, 9, 17, 27, 31, 53, 55, 56]. Among them, the most general and versatile one is the recently developed inexact majorized multi-block proximal ADMM of Chen et al. [6], which we shall briefly describe in the next paragraph.

Under the assumption that the gradient of f is Lipschitz continuous, we know that one can specify a fixed self-adjoint positive semidefinite linear operator Σf : Y → Y and define at each y′ ∈ Y the following convex quadratic function

f(y, y′) := f(y′) + 〈∇f(y′), y − y′〉 + (1/2)‖y − y′‖²_Σf, ∀ y ∈ Y, (1.4)

such that f(y) ≤ f(y, y′), ∀ y, y′ ∈ Y, and f(y′) = f(y′, y′), ∀ y′ ∈ Y.
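As a one-dimensional illustration (our example, not taken from the paper), take Y = R and f(y) = √(1 + y²), whose gradient is Lipschitz continuous with modulus 1; the constant operator Σf = 1 then makes (1.4) a global majorant of f that coincides with f at y = y′:

```python
import numpy as np

# f(y) = sqrt(1 + y^2): convex, with 1-Lipschitz gradient (0 < f'' <= 1).
f = lambda y: np.sqrt(1.0 + y ** 2)
grad = lambda y: y / np.sqrt(1.0 + y ** 2)
Sigma_f = 1.0                      # scalar stand-in for the operator Sigma_f

# Majorant from (1.4): f_hat(y; y') = f(y') + grad(y')(y - y') + (Sigma_f/2)(y - y')^2.
f_hat = lambda y, yp: f(yp) + grad(yp) * (y - yp) + 0.5 * Sigma_f * (y - yp) ** 2

ys = np.linspace(-3.0, 3.0, 121)
# gap[i, j] = f_hat(ys[i]; ys[j]) - f(ys[i]) >= 0, with equality on the diagonal.
gap = f_hat(ys[:, None], ys[None, :]) - f(ys[:, None])
```

The two defining properties of a majorization, f ≤ f(·, y′) everywhere and equality at y′, show up as a nonnegative gap matrix with a zero diagonal.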

Thus, we say that at each y′ ∈ Y, the function f(·, y′) constitutes a majorization of the function f. Let σ > 0 be the penalty parameter. Based on the notion of majorization described above, the majorized augmented Lagrangian function of problem (1.1) is defined by

Lσ(y, z; (x, y′)) := p(y1) + f(y, y′) − 〈b, z〉 + 〈F∗y + G∗z − c, x〉 + (σ/2)‖F∗y + G∗z − c‖², ∀ (y, z, x, y′) ∈ Y × Z × X × Y. (1.5)


Let (x0, y0, z0) ∈ X × Y × Z be a given initial point with y01 ∈ dom p, and let Di : Yi → Yi, i = 1, . . . , s, be given self-adjoint linear operators introduced for the purpose of facilitating the computations of the subproblems. For convenience, we denote, for any y = (y1; . . . ; ys) ∈ Y1 × · · · × Ys,

y<i := (y1; . . . ; yi−1) and y>i := (yi+1; . . . ; ys), ∀ i = 1, . . . , s.

Then, the k-th step of the (inexact) block sGS decomposition based majorized multi-block proximal ADMM in [6], when applied to problem (1.1), takes the following form:

y_i^{k+1/2} ≈ arg min_{y_i∈Yi} { Lσ((y_{<i}^k; y_i; y_{>i}^{k+1/2}), z^k; (x^k, y^k)) + (1/2)‖y_i − y_i^k‖²_{D_i} },  i = s, . . . , 2;

y_i^{k+1} ≈ arg min_{y_i∈Yi} { Lσ((y_{<i}^{k+1}; y_i; y_{>i}^{k+1/2}), z^k; (x^k, y^k)) + (1/2)‖y_i − y_i^k‖²_{D_i} },  i = 1, . . . , s;

z^{k+1} ≈ arg min_{z∈Z} Lσ(y^{k+1}, z; (x^k, y^k));

x^{k+1} = x^k + τσ(F∗y^{k+1} + G∗z^{k+1} − c),    (1.6)

where τ ∈ (0, (1 + √5)/2) was allowed in [6]. As one can observe from (1.5) and (1.6), the quadratic majorization technique in Li et al. [29] was used to replace the original augmented Lagrangian function by the majorized augmented Lagrangian function. This in turn enables us to employ the inexact block sGS decomposition technique in Li et al. [32] to sequentially update the sub-blocks of y individually. More importantly, the algorithm is highly flexible in that all the subproblems are allowed to be solved approximately so as to overcome possible numerical obstacles, for example, when iterative solvers must be employed to solve large-scale linear systems in order to avoid extreme memory requirements and prohibitive computing costs. It has already been demonstrated in [6] that the inexact block sGS decomposition based multi-block ADMM is far superior to the directly extended ADMM in solving high-dimensional linear and convex quadratic semidefinite programming problems, with the step-length in (1.6) restricted to be less than (1 + √5)/2.
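The sGS decomposition underlying the backward sweep (i = s, . . . , 2) and forward sweep (i = 1, . . . , s) in (1.6) can be checked numerically. The sketch below (our illustration of the decomposition theorem of [32], with randomly generated data) performs one backward-then-forward block Gauss-Seidel cycle on the quadratic (1/2)〈y, Qy〉 − 〈r, y〉 and verifies that it coincides with the exact minimizer of the same quadratic plus the proximal term (1/2)‖y − ȳ‖² taken in the operator sGS(Q) := U D⁻¹ U∗, where Q = U + D + U∗ is the block decomposition of Q:

```python
import numpy as np

rng = np.random.default_rng(0)
s, nb = 3, 2                       # 3 blocks, each of size 2
n = s * nb
blk = lambda i: slice(i * nb, (i + 1) * nb)

M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)        # self-adjoint positive definite
D = np.zeros((n, n))               # block diagonal of Q
U = np.zeros((n, n))               # strictly upper block triangular part, Q = U + D + U^T
for i in range(s):
    D[blk(i), blk(i)] = Q[blk(i), blk(i)]
    for j in range(i + 1, s):
        U[blk(i), blk(j)] = Q[blk(i), blk(j)]

r = rng.standard_normal(n)
ybar = rng.standard_normal(n)      # current point y^k

# Backward sweep (i = s, ..., 2): blocks > i use new values, blocks < i use ybar.
yhat = ybar.copy()
for i in range(s - 1, 0, -1):
    rhs = r[blk(i)] - U[blk(i), :] @ yhat - U.T[blk(i), :] @ ybar
    yhat[blk(i)] = np.linalg.solve(D[blk(i), blk(i)], rhs)

# Forward sweep (i = 1, ..., s): blocks < i use new values, blocks > i use yhat.
y_sweep = np.zeros(n)
for i in range(s):
    rhs = r[blk(i)] - U.T[blk(i), :] @ y_sweep - U[blk(i), :] @ yhat
    y_sweep[blk(i)] = np.linalg.solve(D[blk(i), blk(i)], rhs)

# sGS decomposition theorem: the sweep equals the exact proximal minimizer, i.e.
# (Q + sGS(Q)) y_new = r + sGS(Q) ybar with sGS(Q) = U D^{-1} U^T.
sGS = U @ np.linalg.solve(D, U.T)
y_direct = np.linalg.solve(Q + sGS, r + sGS @ ybar)
```

This is why the two sweeps in (1.6) amount to adding one more, automatically constructed, proximal term to the y-subproblem rather than changing the algorithm's fixed points.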

Our focus in this paper is to investigate whether the framework in (1.6) can be proven to be convergent for problem (1.1) when the step-length τ is in the range (0, 2). In particular, we will show that the inexact block sGS decomposition based multi-block ADMM (1.6) is equivalent to an inexact majorized proximal ALM, in the sense that the computations of yk+1, zk+1 and xk+1 in (1.6) can equivalently be written as

(y^{k+1}, z^{k+1}) ≈ arg min_{(y,z)∈Y×Z} { Lσ(y, z; (x^k, y^k)) + (1/2)‖(y; z) − (y^k; z^k)‖²_T };

x^{k+1} = x^k + τσ(F∗y^{k+1} + G∗z^{k+1} − c),

where T : Y × Z → Y × Z is a self-adjoint (not necessarily positive definite) linear operator whose precise definition will be given later, and ‖(y; z)‖²_T := 〈(y; z), T(y; z)〉, ∀ (y, z) ∈ Y × Z. This connection not only provides new theoretical perspectives for analyzing multi-block ADMM-type algorithms, but also has the potential of allowing them to achieve even better computational efficiency, since a larger step-length beyond (1 + √5)/2 can now be taken in (1.6) without adding any extra conditions or additional verification steps such as those extensively used in [48, 30, 6, 5].

The main contributions of this paper are as follows.

• We derive the equivalence of an (inexact) block sGS decomposition based multi-block majorized proximal ADMM to an inexact majorized proximal ALM, and establish the global and local convergence properties of the latter with the step-length τ ∈ (0, 2). As a result, the global and local convergence properties of the former, even with τ ∈ (0, 2), are also established.

• Even for the most conventional two-block case, we are able for the first time to rigorously characterize the connection between the ADMM and the proximal ALM. Note that, given the form of the updating rules of the classic ADMM and ALM, although it is natural to view the ADMM as an approximate version of the ALM, this is not completely true, as can be seen from our analysis in this paper. Indeed, to alleviate the difficulty of solving the subproblems in the ALM, the classic ADMM uses a single cycle of Gauss-Seidel block minimization to replace the full minimization of the augmented Lagrangian function in the ALM. This viewpoint in fact motivated the study of the classic ADMM in the very first paper [21]. However, as was mentioned in [13, 14], there were no known results quantifying this interpretation.

• As a by-product of the second contribution, this paper gives an affirmative answer to the open question of whether the dual sequence generated by the classic ADMM with τ ∈ (0, 2) is convergent if one of the two functions in the objective is linear¹. This is the problem setting of the very first proof for the ADMM in Gabay and Mercier [18, Theorem 3.1], in which the dual sequence is only guaranteed to be bounded, even under very strong assumptions. The later proof of Glowinski [20, Chapter 5, Theorem 5.1] established stronger results than [18], but it requires τ ∈ (0, (1 + √5)/2). Thereafter, only the latter interval, and especially the unit step-length, has been considered. In fact, in a rigorous proof presented recently in [5] for the classic two-block ADMM with τ ∈ (0, (1 + √5)/2), it was shown that the convergence of the dual sequence can be guaranteed under pretty weak conditions, but the convergence of the primal sequence requires more. Hence, it is of much theoretical interest to clarify whether the dual sequence is convergent if the objective contains a linear part while τ ≥ (1 + √5)/2.

• We provide a fairly general criterion for choosing the possibly indefinite² linear operators Di, i = 1, . . . , s, in the proximal terms, which unifies those used in Chen et al. [6] and those used in Zhang et al. [57] to guarantee the viability of the block sGS decomposition techniques and the convergence of the whole sequence generated by the algorithm in (1.6). Recall that the proximal terms in [6] must be positive semidefinite, while in [57] the functions being majorized must be separable with respect to each block of variables. Here, we do not require f to be separable, and indefinite proximal terms are allowed.

• We use a unified criterion, which is weaker than those used in [6], for choosing the proximal terms in the algorithmic framework (1.6) and analyzing its convergence. Note that in [6], compared with the condition [6, (3.2)] imposed on choosing the proximal terms, a stronger condition ([6, (5.26) of Theorem 5.1]) was used to guarantee the convergence of the algorithm. Here, we are able to close this gap while using a weaker condition.

• We conduct extensive numerical experiments on solving linear and convex quadratic semidefinite programming (SDP) problems to demonstrate how the theoretical results obtained here can be exploited to improve the numerical efficiency of ADMM implementations. Based on the numerical results, together with the theoretical analysis in this paper, we are able to give a plausible explanation as to why the ADMM often performs well when the dual step-length is chosen to be the golden ratio of 1.618. Meanwhile, a guiding principle for choosing the step-length in practical implementations of the algorithmic framework (1.6) is derived.

Here we emphasize again that, for solving large-scale instances of the multi-block problem (1.1), a successful multi-block ADMM-type algorithm must not only possess a convergence guarantee but should also numerically perform at least as fast as the directly extended ADMM. Based on our work in this paper, we can conclude that the inexact block sGS decomposition based majorized proximal ADMM studied in [6] indeed possesses those desirable properties. Moreover, this algorithm is a versatile framework, and one can apply it to problem (1.1) in routines other than (1.6). The reason that we are more interested in the iteration scheme (1.6) is not only the theoretical improvement one can achieve, but also its practical merit for solving large-scale problems, especially when the dominating computational cost lies in performing the evaluations associated with the linear mappings G and G∗. A particular case in point is the following problem:

min_{x∈X} { ψ(x) + (1/2)〈x, Qx〉 − 〈c, x〉 | GE x = bE, GI x ≥ bI },    (1.7)

¹This question was first resolved in [48] when the initial multiplier x0 satisfies Gx0 − b = 0 and all the subproblems are solved exactly.

²One may refer to [29] for the details motivating the use of indefinite proximal terms in the two-block majorized proximal ADMM, especially [29, Section 6] on their computational merits, as well as to [57] for similar results in the multi-block case.


where Q, ψ, and c have the same meaning as in (1.2), GE : X → ZE and GI : X → ZI are given linear mappings, and b = (bE; bI) ∈ Z := ZE × ZI is a given vector. By introducing a slack variable x′ ∈ ZI, the above problem can be equivalently reformulated as

min_{x∈X, x′∈ZI} { ψ(x) + (1/2)〈x, Qx〉 − 〈c, x〉 | [GE, 0; GI, IZI](x; x′) = b, x′ ≤ 0 },

where IZI is the identity operator in ZI. The corresponding dual problem, in the minimization form, is then given by

min_{y1, y2, z} { p(y1) + (1/2)〈y2, Qy2〉 − 〈b, z〉 | (y11; y12) + (Q; 0)y2 − [G∗E, G∗I; 0, IZI]z = (c; 0) },

where y1 := (y11; y12) ∈ X × ZI, p(y1) := ψ∗(y11) + δ+(y12) with δ+ being the indicator function of the nonnegative orthant in ZI, y2 ∈ X and z ∈ Z. It is clear that when problem (1.7) has a large number of inequality constraints, the dimension of Z can be much larger than that of X. For such a scenario, the iteration scheme (1.6) is preferable, since the more difficult subproblem involving z is solved only once in each iteration.

Organization

This paper is organized as follows. In Section 2, we present a few important classes of problems that can be handled by (1.1) to illustrate the wide applicability of this model. In Section 3, we design an inexact majorized proximal ALM framework and establish its global and local convergence properties. In Section 4, we show the key result that the sequence generated by the inexact block sGS decomposition based majorized proximal ADMM (1.6), together with a simple error tolerance criterion, is equivalent to the sequence generated by the inexact ALM framework introduced in Section 3. Accordingly, the convergence of the two-block ADMM with the step-length in the interval (0, 2) is also established for problem (1.1) with s = 1. In Section 5, we conduct extensive numerical experiments on the two-block dual linear SDP problems and the multi-block dual convex quadratic SDP problems to illustrate the numerical efficiency of the proposed algorithm, as well as the impact of the step-length on its numerical performance. A few important practical observations from the numerical results are also presented. Finally, we conclude this paper in the last section.

Notation

• Let H and H′ be two finite-dimensional real Hilbert spaces, each endowed with an inner product 〈·, ·〉 and its induced norm ‖ · ‖. We also use ‖ · ‖ to denote the norm induced on the product space H × H′ by the inner product 〈(ν1, ν′1), (ν2, ν′2)〉 := 〈ν1, ν2〉 + 〈ν′1, ν′2〉, ∀ ν1, ν2 ∈ H, ∀ ν′1, ν′2 ∈ H′.

• For any linear map O : H → H′, we use O∗ to denote its adjoint, O−1 its inverse (if invertible), O† its Moore–Penrose pseudoinverse, Range(O) its range space, and ‖O‖ its spectral norm.

• If H′ = H and O is self-adjoint and positive semidefinite, there is a unique self-adjoint positive semidefinite operator, denoted by O1/2, such that O1/2O1/2 = O. In this case, for any ν, ν′ ∈ H we define 〈ν, ν′〉O := 〈Oν, ν′〉 and ‖ν‖O := √〈ν, Oν〉 = ‖O1/2ν‖. If O is also invertible, O1/2 is invertible and we write O−1/2 := (O1/2)−1.

• For k self-adjoint linear operators O1, . . . , Ok, we use Diag(O1, . . . , Ok) to denote the block-diagonal linear operator whose block-diagonal elements are, in order, O1, . . . , Ok.

• For any convex set H ⊆ H, we denote the relative interior of H by ri(H). When the self-adjoint linear operator O : H → H is positive definite, we define, for any ν ∈ H,

distO(ν, H) := inf_{ν′∈H} ‖ν − ν′‖O and Π^O_H(ν) := arg min_{ν′∈H} ‖ν − ν′‖O.

If O is the identity operator, we just omit it from the notation, so that dist(·, H) and ΠH(·) are the standard distance function and the metric projection operator, respectively.

• Let θ : H → (−∞, +∞] be an arbitrary closed proper convex function. We use dom θ to denote its effective domain, ∂θ to denote its subdifferential mapping, and θ∗ to denote its conjugate function. Moreover, for a given self-adjoint and positive definite linear operator O : H → H, we use Prox^O_θ to denote the Moreau–Yosida proximal mapping of θ, which is defined by

Prox^O_θ(ν) := arg min_{ν′∈H} { θ(ν′) + (1/2)‖ν − ν′‖²_O }, ∀ ν ∈ H.

Note that the mapping Prox^O_θ is globally Lipschitz continuous. If O is the identity operator, we will drop O from Prox^O_θ(·).

2 Illustrative Examples

In this section, we present a few important classes of concrete problems, including problems in classic core convex programming as well as models that are popularly used in various real-world applications. As will be shown, these problems and/or their dual problems have the form given by (1.1), so that the algorithm designed in this paper can be utilized to solve them.

2.1 Convex Composite Quadratic Programming

It is well known that many problems are subsumed under the convex composite quadratic programming model (1.2) or its more concrete form (1.7). For example, it includes the important classes of convex quadratic programming (QP), convex quadratic semidefinite programming (QSDP), and convex quadratic programming and weighted centering (QPWC) [41]. As an illustration, consider a convex QSDP problem in the following form:

min_{X∈Sn} { (1/2)〈X, QX〉 − 〈C, X〉 | AE X = bE, AI X ≥ bI, X ∈ Sn+ },    (2.1)

where Sn is the space of n × n real symmetric matrices and Sn+ is the closed convex cone of positive semidefinite matrices in Sn, Q : Sn → Sn is a positive semidefinite linear operator, C ∈ Sn is a given matrix, and AE and AI are linear maps from Sn to the finite-dimensional Euclidean spaces RmE and RmI that contain bE and bI, respectively. To solve this problem, one may consult the recently developed software QSDPNAL of Li et al. [31] and the references therein. The algorithm implemented in QSDPNAL is a two-phase augmented Lagrangian method in which the first phase is an inexact sGS decomposition based multi-block proximal ADMM whose convergence was established in [6, Theorem 5.1]. The solution generated in the first phase is used as the initial point to warm-start the second-phase algorithm, which is an ALM with the inner subproblem in each iteration solved via an inexact semismooth Newton algorithm. In Section 5, we will use the QSDP problem (2.1) to test the algorithm studied in this paper.

Besides the core optimization problems just mentioned, there are many problems from real-world applications that can be cast in the form of (1.2); the following are only a few such examples.

Penalized and Constrained Regression Models

In various statistical applications, the penalized and constrained (PAC) regression [25] often arises in high-dimensional generalized linear models with linear equality and inequality constraints. A concrete example of the PAC regression is the following constrained lasso problem:

min_{x∈Rn} { (1/2)‖Φx − η‖² + λ‖x‖1 | AE x = bE, AI x ≥ bI },    (2.2)


where Φ ∈ Rm×n, AE ∈ RmE×n, AI ∈ RmI×n, η ∈ Rm, bE ∈ RmE and bI ∈ RmI are the given data, and λ > 0 is a given regularization parameter. The statistical properties of problem (2.2) have been studied in [25]. For more details on the applications of the model (2.2), one may refer to [25, 19] and the references therein. In Gaines et al. [19], the authors considered solving (2.2) by first reformulating it as a conventional QP via letting x = x+ − x− and adding the extra constraints x+ ≥ 0 and x− ≥ 0, and then applying the primal ADMM to the conventional QP, in which all the subproblems should be solved exactly (or to very high accuracy) by iterative methods. Such a combination may perform well for low-dimensional problems with moderate sample sizes. But for the more challenging and interesting high-dimensional cases, where n is extremely large and m ≪ n, the approach in [19] is likely to face severe numerical difficulties because of the presence of a huge number of constraints. Fortunately, the algorithm we design in this paper can handle precisely those difficult cases, because the large linear systems associated with the huge number of constraints are not required to be solved to very high accuracy by an iterative solver.

Noisy Matrix Completion and Rank-Correction Step

In Miao et al. [36], the authors introduced a rank-correction step for matrix completion with fixed basis coefficients to overcome the shortcomings of the nuclear norm penalization model for such problems. Let X̄ ∈ V^{n₁×n₂} (where V^{n₁×n₂} may represent the space of n₁ × n₂ real or complex matrices, or the space of n × n real symmetric or Hermitian matrices) be the unknown true low-rank matrix and let X_m be an initial estimator of X̄ from the nuclear norm penalized least squares model. The rank-correction step is to solve the following convex optimization problem

min_X  (1/(2m))‖y − P_o(X)‖² + ρ_m( ‖X‖_* − 〈F(X_m), X〉 )
s.t.  P_A(X) = P_A(X̄),  ‖P_B(X)‖_∞ ≤ b,        (2.3)

where y = P_o(X̄) + ε ∈ R^m is the observed data for the matrix X̄, P_o is the linear map corresponding to the observed entries, ε ∈ R^m is the unknown error, ρ_m > 0 is a given penalty parameter, and F : V^{n₁×n₂} → V^{n₁×n₂} is a spectral operator [10] whose precise definition can be found in [36, Section 5]. Here the constraints P_A(X) = P_A(X̄) and ‖P_B(X)‖_∞ ≤ b represent the fixed elements and bounded elements of X, respectively. If F and the equality constraints are vacuous, problem (2.3) is exactly the noisy matrix completion model considered in [37], and a similar matrix completion model can be found in [26]. One may view (2.3) as an instance of problem (1.2) whose corresponding linear operator Q admits a very simple form.

2.2 Two-Block Problems

Next we present a few important classes of two-block problems whose objective functions contain a linear part.

Semidefinite Programming

One of the most prominent examples of problem (1.1) with 2 blocks of variables (i.e., s = 1) is the dual linear semidefinite programming (SDP) problem given by

min_{Y,z} { δ_{S^n_+}(Y) − 〈b, z〉 | Y + A∗z = C },        (2.4)

where A : S^n → R^m is a given linear map, and b ∈ R^m and C ∈ S^n are given data. The notation δ_{S^n_+} denotes the indicator function of S^n_+. For problem (2.4), various ADMM algorithms have been employed. As far as we are aware, the classic two-block ADMM with unit step-length was first employed in Povh et al. [43], under the name of the boundary point method, for solving the SDP problem (2.4). It was later extended in Malick et al. [34] with a convergence proof. The ADMM approach was subsequently used in the software SDPNAL developed by Zhao et al. [58] to warm-start a semismooth Newton method based ALM for solving problem (2.4).
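The Y-subproblem of an ADMM applied to (2.4) amounts to evaluating the proximal mapping of δ_{S^n_+}, i.e., projecting a symmetric matrix onto the PSD cone by thresholding its eigenvalues at zero. A minimal sketch for the 2 × 2 case, where the eigendecomposition is available in closed form, follows (illustrative only; for general n one would threshold a full eigendecomposition):

```python
import math

# Projection of a symmetric 2x2 matrix [[a, b], [b, c]] onto the PSD cone
# S^2_+, i.e. the proximal mapping of the indicator delta_{S^n_+} in (2.4).

def project_psd_2x2(a, b, c):
    # Eigenvalues of [[a, b], [b, c]] via the quadratic formula.
    mean = 0.5 * (a + c)
    rad = math.hypot(0.5 * (a - c), b)
    lam1, lam2 = mean + rad, mean - rad
    if lam2 >= 0.0:           # already PSD: nothing to do
        return [[a, b], [b, c]]
    if lam1 <= 0.0:           # negative semidefinite: project to zero
        return [[0.0, 0.0], [0.0, 0.0]]
    # Keep only the positive eigenvalue lam1 with its unit eigenvector u.
    if b != 0.0:
        v = [lam1 - c, b]
    else:
        v = [1.0, 0.0] if a > c else [0.0, 1.0]
    nrm = math.hypot(v[0], v[1])
    u = [v[0] / nrm, v[1] / nrm]
    return [[lam1 * u[0] * u[0], lam1 * u[0] * u[1]],
            [lam1 * u[0] * u[1], lam1 * u[1] * u[1]]]
```

For example, projecting diag(1, −2) keeps the positive eigenvalue and zeroes the negative one.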


In Section 5, we will conduct extensive numerical experiments on solving a few classes of linear SDP problems via the two-block ADMM, but with the dual step-length chosen in the interval (0, 2), as is guaranteed to be valid by the theory established in this paper.

Equality Constrained Problems

Consider the equality constrained problem

min_{x∈X} { θ(x) | Gx = b },        (2.5)

where G : X → R^m is a linear map, b ∈ R^m is a given vector, and θ : X → (−∞, +∞] is a simple closed proper convex function whose proximal mapping can be computed efficiently. The dual problem of (2.5) can be written in the minimization form as

min_{y,z} { θ∗(y) − 〈b, z〉 | y − G∗z = 0 }.        (2.6)

A concrete example of problem (2.5), with X := R^n and θ(x) := ‖x‖₁, is the basis pursuit (BP) problem [7], which has been widely used in sparse signal recovery and image restoration. Another example of (2.5) is the nuclear norm based matrix completion problem, for which X := R^{n₁×n₂} and θ(x) = ‖x‖_*. Moreover, the so-called tensor completion problem [33] also falls into this category.

We note that for the application problems just mentioned, the dimension of X is generally much larger than m, i.e., the dimension of the linear constraints. Therefore, from the computational viewpoint, it is generally more economical to apply the two-block ADMM to the dual problem (2.6) instead of the primal problem (2.5) (by introducing an extra variable x′ and adding the condition x − x′ = 0), because the former will solve smaller m × m linear systems in each iteration whereas the latter will correspondingly need to solve much larger linear systems.

Composite Problems

A composite problem can take the following form

min_{z∈Z} f(c − G∗z),        (2.7)

where f : Z → (−∞, +∞] is a (possibly nonsmooth) closed proper convex function whose proximal mapping can be computed efficiently, G : Z → X is a given linear operator and c ∈ X is given data. By introducing a slack variable, problem (2.7) can be recast as

min_{y,z} { f(y) | y + G∗z = c }.

Problem (2.7) covers many real-world applications, such as the well-known least absolute deviation (LAD) problem (also known as least absolute error (LAE), least absolute value (LAV), least absolute residual (LAR), sum of absolute deviations, or the ℓ₁-norm condition). The model (2.7) also includes the Huber fitting problem [24]. We shall not continue with more examples, as there are too many applications to list here in the manner of a literature review.

Consensus Optimization

Consider the following problem

min_{z∈Z} Σ_{i=1}^n f_i(G∗_i z),        (2.8)

where each f_i is a closed proper convex function and each G_i : Y_i → Z is a linear operator. The model (2.8) includes the global variable consensus optimization and the general variable consensus optimization problems, as well as their


regularized versions (see [4, Section 7]), which have been widely applied in many areas such as machine learning, signal processing and wireless communication [4, 3, 49, 46, 59]. In the consensus optimization setting, it is usually preferable to solve subproblems each involving a subset of the component functions f₁, . . . , f_n instead of all of them. Therefore, one can equivalently recast problem (2.8) as

min_{y,z} { Σ_{i=1}^n f_i(y_i) | y_i − G∗_i z = 0, 1 ≤ i ≤ n }.        (2.9)

Obviously, when the two-block ADMM is applied to solve (2.9), the subproblem with respect to y separates into n independent problems that can be solved in parallel. In [4], the variable z in (2.9) is called the central collector. Besides, network-based decentralized and distributed computation of consensus optimization, such as the distributed lasso in [35], also falls within the problem setting of this paper.
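The parallel structure of the y-subproblem in (2.9) can be sketched on a toy instance (assumptions not from the paper: scalar variables, G_i = identity, and quadratic f_i(y_i) = (y_i − a_i)²/2 with made-up data a_i), for which every update is available in closed form and the optimal consensus value is the average of the a_i:

```python
# Toy two-block ADMM for the consensus reformulation (2.9), illustrative
# sketch only: f_i(y_i) = (y_i - a_i)^2 / 2 and G_i = identity on scalars.

def consensus_admm(a, sigma=1.0, tau=1.0, iters=200):
    n = len(a)
    y = [0.0] * n
    x = [0.0] * n     # multipliers of y_i - z = 0
    z = 0.0           # the "central collector"
    for _ in range(iters):
        # Parallel y-updates: prox of each f_i around the current (z, x_i).
        y = [(a[i] + sigma * z - x[i]) / (1.0 + sigma) for i in range(n)]
        # z-update: average of the y_i shifted by the multipliers.
        z = sum(y[i] + x[i] / sigma for i in range(n)) / n
        # Multiplier updates with dual step-length tau.
        x = [x[i] + tau * sigma * (y[i] - z) for i in range(n)]
    return z
```

With data [1, 2, 6] the iterates drive z to the average 3; the n y-updates in each sweep are independent and could run in parallel.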

3 An Inexact Majorized ALM with Indefinite Proximal Terms

In this section, we present an inexact majorized indefinite-proximal ALM. This algorithm, as well as its global and local convergence properties, not only constitutes a generalization of the original (proximal) ALM, but also paves the way for us to establish its equivalence with the inexact block sGS decomposition based indefinite-proximal multi-block ADMM in the next section.

Let X and W be two finite-dimensional real Hilbert spaces, each endowed with an inner product 〈·, ·〉 and its induced norm ‖·‖. We consider the following fairly general linearly constrained convex optimization problem

min_{w∈W} { ϕ(w) + h(w) | A∗w = c },        (3.1)

where ϕ : W → (−∞, +∞] is a closed proper convex function, h : W → (−∞, +∞) is a continuously differentiable convex function whose gradient is Lipschitz continuous, A : X → W is a linear mapping and c ∈ X is the given data. The Karush-Kuhn-Tucker (KKT) system of problem (3.1) is given by

0 ∈ ∂ϕ(w) + ∇h(w) + Ax,    A∗w − c = 0.        (3.2)

For any (w, x) ∈ W × X that solves the KKT system (3.2), w is a solution to problem (3.1) while x is a dual solution of (3.1).

The fact that the gradient of h is Lipschitz continuous implies that there exists a self-adjoint positive semidefinite linear operator Σ_h : W → W such that for any w′ ∈ W, h(w) ≤ ĥ(w, w′), where

ĥ(w, w′) := h(w′) + 〈∇h(w′), w − w′〉 + (1/2)‖w − w′‖²_{Σ_h},   ∀w ∈ W.        (3.3)

We call the function ĥ(·, w′) : W → (−∞, +∞) a majorization of h at w′. The following result, whose proof can be found in [57, Lemma 3.2], will be used later.

Lemma 3.1. Suppose that (3.3) holds for any given w′ ∈ W. Then, it holds that

〈∇h(w) − ∇h(w′), w″ − w′〉 ≥ −(1/4)‖w − w″‖²_{Σ_h},   ∀w, w′, w″ ∈ W.

Let σ > 0 be a given penalty parameter. The majorized augmented Lagrangian function associated with problem (3.1) is defined by

L_σ(w; (x, w′)) := ϕ(w) + ĥ(w, w′) + 〈A∗w − c, x〉 + (σ/2)‖A∗w − c‖²,   ∀(w, x, w′) ∈ W × X × W.        (3.4)

In the following, we propose an inexact majorized indefinite-proximal ALM, Algorithm iPALM, for solving problem (3.1). This algorithm is an extension of the proximal method of multipliers developed by Rockafellar


[45], with new ingredients added based on the recent progress on proximal terms that are not necessarily positive definite [16, 29, 57] and on the implementable inexact minimization criteria studied in [6]. For the convenience of the later convergence analysis, we make the following blanket assumption.

Assumption 3.1. The solution set to the KKT system (3.2) is nonempty and S : W → W is a given self-adjoint (not necessarily positive semidefinite) linear operator such that

S ⪰ −(1/2)Σ_h   and   (1/2)Σ_h + σAA∗ + S ≻ 0.        (3.5)

We are now ready to present Algorithm iPALM that will be studied in this section.

Algorithm iPALM: An inexact majorized indefinite-proximal ALM

Let {ε_k} be a summable sequence of nonnegative numbers. Choose an initial point (x⁰, w⁰) ∈ X × W. For k = 0, 1, . . ., perform the following steps in each iteration.

Step 1. Compute

w^{k+1} ≈ w̄^{k+1} := argmin_{w∈W} { L_σ(w; (x^k, w^k)) + (1/2)‖w − w^k‖²_S }        (3.6)

such that there exists a vector d^k ∈ W satisfying ‖d^k‖ ≤ ε_k and

d^k ∈ ∂_w L_σ(w^{k+1}; (x^k, w^k)) + S(w^{k+1} − w^k).        (3.7)

Step 2. Compute x^{k+1} := x^k + τσ(A∗w^{k+1} − c) with τ ∈ (0, 2) being the step-length.
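The two steps above can be sketched on a minimal exact instance (a sketch under simplifying assumptions, not the paper's implementation): ε_k ≡ 0, S = 0, ϕ ≡ 0, h(w) = ‖w − q‖²/2, and a single hyperplane constraint 〈a, w〉 = c, so that the subproblem (3.6) has a closed-form solution and the minimizer is the projection of q onto the hyperplane.

```python
# Exact toy instance of Algorithm iPALM: minimize ||w - q||^2 / 2 subject
# to <a, w> = c, with multiplier step-length tau in (0, 2).

def ipalm_hyperplane(q, a, c, sigma=1.0, tau=1.5, iters=100):
    n = len(q)
    x = 0.0                                   # multiplier of <a, w> = c
    na2 = sum(ai * ai for ai in a)
    w = list(q)
    for _ in range(iters):
        # Step 1 (exact): minimize ||w-q||^2/2 + x(<a,w>-c) + sigma/2 (<a,w>-c)^2.
        # Writing t = <a, w> at the minimizer gives a scalar equation for t.
        aq = sum(a[i] * q[i] for i in range(n))
        t = (aq - na2 * x + sigma * na2 * c) / (1.0 + sigma * na2)
        w = [q[i] - a[i] * x + sigma * a[i] * (c - t) for i in range(n)]
        # Step 2: dual update with step-length tau * sigma.
        x += tau * sigma * (sum(a[i] * w[i] for i in range(n)) - c)
    return w
```

For q = (2, 0), a = (1, 1), c = 0 the iterates converge to the projection (1, −1) of q onto the hyperplane.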

We shall next proceed to analyze the global convergence, the local convergence rate and the iteration complexity of Algorithm iPALM. For notational convenience, we collect the total quadratic information in the objective function of (3.6) in the following linear operator

M := Σ_h + S + σAA∗.        (3.8)

The following result presents two important inequalities for the subsequent analysis. The first characterizes the distance (in the metric induced by M) from the computed solution to the true solution of the subproblem in (3.6), while the second presents a non-monotone descent property of the sequence generated by Algorithm iPALM.

Proposition 3.1. Suppose that Assumption 3.1 holds. Then,

(a) the sequence {(x^k, w^k)} generated by Algorithm iPALM and the auxiliary sequence {w̄^k} defined in (3.6) are well-defined, and it holds that

‖w^{k+1} − w̄^{k+1}‖²_M ≤ 〈d^k, w^{k+1} − w̄^{k+1}〉;        (3.9)

(b) for any given (x∗, w∗) ∈ X × W that solves the KKT system (3.2) and k ≥ 1, it holds that

( (1/(2τσ))‖x_e^{k+1}‖² + (1/2)‖w_e^{k+1}‖²_{Σ_h+S} ) − ( (1/(2τσ))‖x_e^k‖² + (1/2)‖w_e^k‖²_{Σ_h+S} )
  ≤ −( ((2−τ)σ/2)‖A∗w_e^{k+1}‖² + (1/2)‖w^{k+1} − w^k‖²_{(1/2)Σ_h+S} − 〈d^k, w_e^{k+1}〉 ),        (3.10)

where x_e := x − x∗, ∀x ∈ X and w_e := w − w∗, ∀w ∈ W.


Proof. (a) From (3.5) and (3.8) we know that M ≻ 0. Hence, each of the subproblems in Algorithm iPALM is strongly convex, so that each w̄^{k+1} is uniquely determined by (x^k, w^k). Note that, for the given ε_k ≥ 0, one can always find a certain w^{k+1} such that ‖d^k‖ ≤ ε_k with d^k being given in (3.7); see [12, Lemma 4.5]. Hence, Algorithm iPALM is well-defined. According to (3.3) and (3.4), the objective function in (3.6) is given by

ϕ(w) + 〈∇h(w^k) + Ax^k, w〉 + (σ/2)‖A∗w − c‖² + (1/2)‖w − w^k‖²_{Σ_h+S},

so that (3.7) implies that

d^k ∈ ∂ϕ(w^{k+1}) + ∇h(w^k) + Ax^k + σA(A∗w^{k+1} − c) + (Σ_h + S)(w^{k+1} − w^k).        (3.11)

Therefore, from the definitions of the Moreau-Yosida proximal mapping and of M in (3.8), one has that

w^{k+1} = Prox^M_ϕ( M^{-1}[ d^k − (∇h(w^k) + Ax^k − σAc − (Σ_h + S)w^k) ] ).

Consequently, by the Lipschitz continuity of Prox^M_ϕ [28, Proposition 2.3] and the fact that d^k can be set to zero if w^{k+1} = w̄^{k+1}, one can readily get (3.9).

(b) Let (x∗, w∗) ∈ X × W be an arbitrary solution to the KKT system (3.2). Obviously, one has that −∇h(w∗) − Ax∗ ∈ ∂ϕ(w∗) and A∗w∗ = c. This, together with (3.11) and the maximal monotonicity of ∂ϕ, implies that

〈d^k − ∇h(w^k) + ∇h(w∗) − Ax_e^k − σA(A∗w^{k+1} − c) − (Σ_h + S)(w^{k+1} − w^k), w_e^{k+1}〉 ≥ 0.

Therefore, by using the fact that

A∗w^{k+1} − c = A∗w_e^{k+1} = (1/(τσ))(x^{k+1} − x^k),        (3.12)

one can obtain from the above inequality and Lemma 3.1 that

〈d^k, w_e^{k+1}〉 − (1/(τσ))〈x_e^k, x^{k+1} − x^k〉 − σ‖A∗w_e^{k+1}‖² − 〈(Σ_h + S)(w^{k+1} − w^k), w_e^{k+1}〉
  ≥ 〈∇h(w^k) − ∇h(w∗), w_e^{k+1}〉 ≥ −(1/4)‖w^{k+1} − w^k‖²_{Σ_h} = −(1/2)‖w^{k+1} − w^k‖²_{(1/2)Σ_h}.        (3.13)

Note that 〈x_e^k, x^{k+1} − x^k〉 = (1/2)‖x_e^{k+1}‖² − (1/2)‖x_e^k‖² − (1/2)‖x^{k+1} − x^k‖² and

〈(Σ_h + S)(w^{k+1} − w^k), w_e^{k+1}〉 = (1/2)‖w^{k+1} − w^k‖²_{Σ_h+S} + (1/2)‖w_e^{k+1}‖²_{Σ_h+S} − (1/2)‖w_e^k‖²_{Σ_h+S}.

Then, (3.10) follows from (3.13), and this completes the proof of the proposition.

3.1 Global Convergence

For the convenience of our analysis, we define the following two linear operators

Ξ := τσ( (1/2)Σ_h + S + ((2−τ)σ/6)AA∗ )   and   Θ := τσ( Σ_h + S + ((2−τ)σ/3)AA∗ ),        (3.14)

which will be used to define metrics in W. Note that τ ∈ (0, 2). If (3.5) in Assumption 3.1 holds, one has that

(1/(τσ))Ξ = ((2−τ)/6)( (1/2)Σ_h + S + σAA∗ ) + (1 − (2−τ)/6)( (1/2)Σ_h + S ) ⪰ ((2−τ)/6)( (1/2)Σ_h + S + σAA∗ ) ≻ 0,
(1/(τσ))Θ = (1/(τσ))Ξ + (1/2)Σ_h + ((2−τ)σ/6)AA∗ ⪰ (1/(τσ))Ξ ≻ 0.        (3.15)


Moreover, we define the block-diagonal linear operator Ω : U → U, where U := X × W, by

Ω(x; w) := (x; Θw),   ∀(x, w) ∈ X × W,        (3.16)

where Θ is given by (3.14), so that ‖(x; w)‖²_Ω = ‖x‖² + ‖w‖²_Θ. Now we establish the convergence theorem of Algorithm iPALM. The corresponding proof mainly follows the proof of [6, Theorem 5.1] on the convergence of an inexact majorized semi-proximal ADMM, and the following result on quasi-Fejér monotone sequences will be used.

Lemma 3.2. Let {a_k}_{k≥0} be a sequence of nonnegative real numbers satisfying a_{k+1} ≤ a_k + ε_k for all k ≥ 0, where {ε_k}_{k≥0} is a nonnegative and summable sequence of real numbers. Then {a_k} converges to a unique limit.
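A quick numerical illustration of Lemma 3.2 (the recursion below is made up purely for illustration): a sequence that is allowed to increase, but only by summable amounts ε_k = 2^{−k}, still settles down to a limit.

```python
# Quasi-Fejer monotonicity demo: a_{k+1} <= a_k + eps_k with summable eps_k.
# The sequence increases by at most eps_k on even steps and contracts on
# odd steps, so the quasi-Fejer condition holds at every step.

def quasi_fejer_tail(n=200):
    a = 1.0
    seq = [a]
    for k in range(n):
        eps_k = 0.5 ** k
        a = a + eps_k if k % 2 == 0 else 0.9 * a
        seq.append(a)
    return seq
```

The tail of the returned sequence is nearly constant, consistent with convergence to a unique limit.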

Theorem 3.1. Suppose that Assumption 3.1 holds and the sequence {(x^k, w^k)} is generated by Algorithm iPALM. Then,

(a) for any solution (x∗, w∗) ∈ X × W of the KKT system (3.2) and k ≥ 1, we have that

‖(x_e^{k+1}; w_e^{k+1})‖²_Ω − ‖(x_e^k; w_e^k)‖²_Ω ≤ −( ((2−τ)/(3τ))‖x^{k+1} − x^k‖² + ‖w^{k+1} − w^k‖²_Ξ − 2τσ〈d^k, w_e^{k+1}〉 ),        (3.17)

where x_e := x − x∗, ∀x ∈ X and w_e := w − w∗, ∀w ∈ W;

(b) the sequence {(x^k, w^k)} is bounded;

(c) any accumulation point of the sequence {(x^k, w^k)} solves the KKT system (3.2);

(d) the whole sequence {(x^k, w^k)} converges to a solution of the KKT system (3.2).

Proof. (a) By using (3.10), together with the definitions of Ξ and Θ in (3.14) and the fact that A∗w_e^{k+1} = (1/(τσ))(x^{k+1} − x^k), one can get

‖(x_e^{k+1}; w_e^{k+1})‖²_Ω − ‖(x_e^k; w_e^k)‖²_Ω = ( ‖x_e^{k+1}‖² + ‖w_e^{k+1}‖²_Θ ) − ( ‖x_e^k‖² + ‖w_e^k‖²_Θ )
  ≤ −τσ( (2(2−τ)σ/3)‖A∗w_e^{k+1}‖² + ((2−τ)σ/3)‖A∗w_e^k‖² + ‖w^{k+1} − w^k‖²_{(1/2)Σ_h+S} − 2〈d^k, w_e^{k+1}〉 )
  ≤ −( ((2−τ)/(3τ))‖x^{k+1} − x^k‖² + τσ((2−τ)/3)( σ‖A∗w_e^{k+1}‖² + σ‖A∗w_e^k‖² ) + τσ‖w^{k+1} − w^k‖²_{(1/2)Σ_h+S} − 2τσ〈d^k, w_e^{k+1}〉 )
  ≤ −( ((2−τ)/(3τ))‖x^{k+1} − x^k‖² + τσ‖w^{k+1} − w^k‖²_{((2−τ)σ/6)AA∗+(1/2)Σ_h+S} − 2τσ〈d^k, w_e^{k+1}〉 ),

which, together with the definition of the linear operator Ξ in (3.14), implies (3.17).

(b) Define x̄^{k+1} := x^k + τσ(A∗w̄^{k+1} − c), ∀k ≥ 0. From (3.6), (3.7) and (3.17) one can get that for any k ≥ 0,

‖(x̄_e^{k+1}; w̄_e^{k+1})‖²_Ω ≤ ‖(x_e^k; w_e^k)‖²_Ω − ( ((2−τ)/(3τ))‖x̄^{k+1} − x^k‖² + ‖w̄^{k+1} − w^k‖²_Ξ ).

In particular, one has ‖(x̄_e^{k+1}; w̄_e^{k+1})‖_Ω ≤ ‖(x_e^k; w_e^k)‖_Ω, ∀k ≥ 0. Therefore, it holds that

‖(x_e^{k+1}; w_e^{k+1})‖_Ω ≤ ‖(x̄_e^{k+1}; w̄_e^{k+1})‖_Ω + ‖(x^{k+1} − x̄^{k+1}; w^{k+1} − w̄^{k+1})‖_Ω
  ≤ ‖(x_e^k; w_e^k)‖_Ω + ‖( τσA∗(w^{k+1} − w̄^{k+1}); Θ^{1/2}(w^{k+1} − w̄^{k+1}) )‖
  = ‖(x_e^k; w_e^k)‖_Ω + ‖w^{k+1} − w̄^{k+1}‖_{τ²σ²AA∗+Θ},   ∀k ≥ 0.        (3.18)


From (3.9) we know that ‖w^{k+1} − w̄^{k+1}‖²_M ≤ 〈M^{-1/2}d^k, M^{1/2}(w^{k+1} − w̄^{k+1})〉, so that

‖M^{1/2}(w^{k+1} − w̄^{k+1})‖ ≤ ‖M^{-1/2}d^k‖ ≤ ‖M^{-1/2}‖‖d^k‖.

Therefore, it holds that

‖w^{k+1} − w̄^{k+1}‖ ≤ ‖M^{-1/2}‖‖M^{1/2}(w^{k+1} − w̄^{k+1})‖ ≤ ‖M^{-1/2}‖²‖d^k‖ ≤ ‖M^{-1}‖ε_k,   ∀k ≥ 0.        (3.19)

Therefore, by combining (3.18) and (3.19) we can get

‖(x_e^{k+1}; w_e^{k+1})‖_Ω ≤ ‖(x_e^k; w_e^k)‖_Ω + √(‖τ²σ²AA∗ + Θ‖) ‖M^{-1}‖ ε_k,   ∀k ≥ 0.

Hence, the sequence {‖(x_e^{k+1}; w_e^{k+1})‖_Ω} is quasi-Fejér monotone and, by Lemma 3.2, converges to a unique limit. Since Ω defined in (3.16) is positive definite, we further know that the sequence {(x^k, w^k)} is bounded.

(c) From (3.17) we know that for any k ≥ 0,

Σ_{i=0}^k ( ‖(x_e^i; w_e^i)‖²_Ω − ‖(x_e^{i+1}; w_e^{i+1})‖²_Ω + 2τσε_i‖w_e^{i+1}‖ ) ≥ Σ_{i=0}^k ( ((2−τ)/(3τ))‖x^{i+1} − x^i‖² + ‖w^{i+1} − w^i‖²_Ξ ).        (3.20)

Since {(x^k, w^k)} is bounded and {ε_k} is summable, it holds that

Σ_{i=0}^∞ ( ‖(x_e^i; w_e^i)‖²_Ω − ‖(x_e^{i+1}; w_e^{i+1})‖²_Ω + 2τσε_i‖w_e^{i+1}‖ ) < ∞,

which, together with (3.20) and the fact that Ξ ≻ 0, implies

lim_{k→∞}(x^{k+1} − x^k) = 0   and   lim_{k→∞}(w^{k+1} − w^k) = 0.        (3.21)

Suppose that the subsequence {(x^{k_j}, w^{k_j})} of {(x^k, w^k)} converges to some limit point (x^∞, w^∞). By taking limits on both sides of (3.11) and (3.12) along k_j and using (3.21) and [44, Theorem 24.6], one can get

0 ∈ ∂ϕ(w^∞) + ∇h(w^∞) + Ax^∞   and   A∗w^∞ − c = 0,

which implies that (x^∞, w^∞) is a solution to the KKT system (3.2).

(d) Note that (3.17) holds for any (x∗, w∗) satisfying the KKT system (3.2). Therefore, we can choose x∗ = x^∞ and w∗ = w^∞ in (3.17) to get

‖(x^{k+1} − x^∞; w^{k+1} − w^∞)‖²_Ω ≤ ‖(x^k − x^∞; w^k − w^∞)‖²_Ω + 2τσ‖w^{k+1} − w^∞‖ε_k.

Note that {w^k} is bounded. Then, the above inequality, together with Lemma 3.2, implies that the quasi-Fejér monotone sequence {‖(x^k − x^∞; w^k − w^∞)‖²_Ω} converges. Since (x^∞, w^∞) is a limit point of {(x^k, w^k)}, one has that

lim_{k→∞} ‖(x^k − x^∞; w^k − w^∞)‖²_Ω = 0,

which, together with the fact that Ω ≻ 0, implies that the whole sequence {(x^k, w^k)} converges to (x^∞, w^∞). This completes the proof of the theorem.


3.2 Local Convergence Rate

In this section, we present the local convergence rate analysis of Algorithm iPALM. For this purpose, we denote U := X × W and consider the KKT residual mapping of problem (3.1) defined by

R(u) = R(x, w) := ( c − A∗w ; w − Prox_ϕ(w − ∇h(w) − Ax) ),   ∀u = (x, w) ∈ X × W.        (3.22)

Note that R(u) = 0 if and only if u = (x, w) is a solution to the KKT system (3.2), whose solution set can therefore be characterized by K := {u ∈ U | R(u) = 0}. Moreover, the residual mapping R has the following property.
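The residual (3.22) is easy to evaluate on a toy instance (illustrative assumptions, not from the paper): W = R², ϕ ≡ 0 (so Prox_ϕ is the identity), h(w) = ‖w‖²/2, and a single constraint A∗w = w₁ + w₂ = 2, whose KKT point is w∗ = (1, 1) with multiplier x∗ = −1.

```python
# Toy evaluation of the KKT residual mapping (3.22) on the instance above.

def kkt_residual(x, w, c=2.0):
    # First block: c - A*w.
    r1 = c - (w[0] + w[1])
    # Second block: w - Prox_phi(w - grad h(w) - A x), with Prox_phi = id,
    # grad h(w) = w and A x = (x, x), so the prox argument is (-x, -x).
    prox_arg = [w[i] - w[i] - x for i in range(2)]
    r2 = [w[i] - prox_arg[i] for i in range(2)]
    return (r1 ** 2 + sum(v * v for v in r2)) ** 0.5
```

The residual vanishes exactly at the KKT point and is positive away from it, matching the characterization of K.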

Lemma 3.3. Suppose that Assumption 3.1 holds and the sequence {u^k := (x^k, w^k)} is generated by Algorithm iPALM. Then, for any k ≥ 0,

‖R(u^{k+1})‖² ≤ (1/(τ²σ²))‖x^k − x^{k+1}‖² + (2‖Σ_h + S‖/(τσ))‖w^{k+1} − w^k‖²_Θ + 2‖(1 − τ^{-1})A(x^{k+1} − x^k) + d^k‖².        (3.23)

Proof. Note that (3.11) holds. Then, one can see that

w^{k+1} = Prox_ϕ( w^{k+1} + d^k − ∇h(w^k) − A( x^k + (x^{k+1} − x^k)/τ ) − (Σ_h + S)(w^{k+1} − w^k) ).

By substituting the above equality and c − A∗w^{k+1} = (1/(τσ))(x^k − x^{k+1}) into the definition of R(u^{k+1}) in (3.22) and using the Lipschitz continuity of Prox_ϕ, one can get

‖R(u^{k+1})‖² ≤ (1/(τ²σ²))‖x^k − x^{k+1}‖² + 2‖(1 − τ^{-1})A(x^{k+1} − x^k) + d^k‖² + 2‖∇h(w^{k+1}) − ∇h(w^k) − (Σ_h + S)(w^{k+1} − w^k)‖².        (3.24)

By using Clarke's mean value theorem [8, Proposition 2.6.5] we know that for any k ≥ 0 there exists a linear operator Σ^k : W → W such that ∇h(w^{k+1}) − ∇h(w^k) = Σ^k(w^{k+1} − w^k) with 0 ⪯ Σ^k ⪯ Σ_h, so that

‖∇h(w^{k+1}) − ∇h(w^k) − (Σ_h + S)(w^{k+1} − w^k)‖² = ‖(Σ_h + S − Σ^k)(w^{k+1} − w^k)‖²
  ≤ ‖Σ_h + S − Σ^k‖〈w^{k+1} − w^k, (Σ_h + S − Σ^k)(w^{k+1} − w^k)〉
  ≤ ‖Σ_h + S‖〈w^{k+1} − w^k, (Σ_h + S)(w^{k+1} − w^k)〉
  ≤ (‖Σ_h + S‖/(τσ))〈w^{k+1} − w^k, τσ( Σ_h + S + ((2−τ)σ/3)AA∗ )(w^{k+1} − w^k)〉,   ∀k ≥ 0,        (3.25)

where the last inequality comes from the fact that 0 < τ < 2. Then, by using the definition of Θ in (3.14), one can readily see from (3.24) and (3.25) that (3.23) holds. This completes the proof.

To analyze the linear convergence rate of Algorithm iPALM, we shall introduce the following error bound condition.

Definition 3.1. The KKT residual mapping R defined in (3.22) is said to be metrically subregular³ [11, 3.8 [3H]] (with the modulus κ > 0) at ū ∈ K for 0 ∈ U if there exists a constant r > 0 such that

dist(u, K) ≤ κ‖R(u)‖,   ∀u ∈ {u ∈ U | ‖u − ū‖ ≤ r}.        (3.26)

³ This is equivalent to saying that R^{-1} is calm at 0 ∈ U for ū ∈ K with the same modulus κ > 0; see [11, Theorem 3H.3].


Suppose that Assumption 3.1 holds. We know from (3.14) and (3.15) that Ξ ≻ 0. Hence, one can let ζ > 0 be the smallest real number such that ζΞ ⪰ Θ. For notational convenience, we define the following positive constants:

ρ := max{ (6σ²(τ−1)²‖A∗A‖ + 3)/(τσ²(2−τ)), 2ζ‖Σ_h + S‖/(τσ) } max{‖Θ‖, 1},        (3.27)

β := max{ √ζ, √(3τ/(2−τ)) },        (3.28)

μ := √( τσ‖Σ_h + S + (2/3)(1+τ)σAA∗‖ ) ‖M^{-1}‖.        (3.29)

To ensure the local linear convergence rate of Algorithm iPALM, we need extra conditions to control the error term d^k in each iteration. Hence, we make the following assumption.

Assumption 3.2. There exist an integer k₀ > 0 and a sequence of nonnegative real numbers {η_k} such that

sup_{k≥k₀} η_k < 1/μ   and   ‖d^k‖ ≤ η_k‖u^k − u^{k+1}‖,   ∀k ≥ k₀.        (3.30)

Now we are ready to present the local convergence rate of Algorithm iPALM.

Theorem 3.2. Suppose that Assumptions 3.1 and 3.2 hold. Let {u^k = (x^k, w^k)} be the sequence generated by Algorithm iPALM, which converges to u∗ := (x∗, w∗) ∈ K. Suppose that the KKT residual mapping R defined in (3.22) is metrically subregular at u∗ for 0 ∈ U with the modulus κ > 0, in the sense that there exists a constant r > 0 such that (3.26) holds with ū = u∗. Then, there exists a threshold k̄ > 0 such that for all k ≥ k̄,

dist_Ω(u^{k+1}, K) ≤ ϑ_k dist_Ω(u^k, K)   with   ϑ_k := (1/(1 − μη_k))( √(κ²ρ/(1 + κ²ρ)) + μη_k(1 + β) ).        (3.31)

Moreover, if it holds that

sup_{k≥k̄} η_k < (1/(μ(2 + β)))( 1 − √(κ²ρ/(1 + κ²ρ)) ),        (3.32)

then one has sup_{k≥k̄} ϑ_k < 1, and the convergence rate of dist_Ω(u^k, K) is Q-linear when k ≥ k̄.

Proof. Denote u_e := u − u∗ for all u ∈ U, and define x̄^{k+1} := x^k + τσ(A∗w̄^{k+1} − c) and ū^{k+1} := (x̄^{k+1}, w̄^{k+1}), ∀k ≥ 0. Since {u^k} converges to u∗ and {d^k} converges to 0 as k → ∞, one has from (3.9) that {ū^k} also converges to u∗ as k → ∞. Therefore, there exists a threshold k̄ > 0 such that

‖u_e^{k+1}‖ ≤ r   and   ‖ū_e^{k+1}‖ ≤ r,   ∀k ≥ k̄.        (3.33)

According to Lemma 3.3, one can set u^{k+1} = ū^{k+1} and d^k = 0 in (3.23) and use the fact that ζΞ ⪰ Θ to obtain that

‖R(ū^{k+1})‖² ≤ ( 2(τ−1)²‖A∗A‖/τ² + 1/(τσ)² )‖x̄^{k+1} − x^k‖² + (2‖Σ_h + S‖/(τσ))‖w̄^{k+1} − w^k‖²_Θ
  ≤ max{ (6σ²(τ−1)²‖A∗A‖ + 3)/(τσ²(2−τ)), 2ζ‖Σ_h + S‖/(τσ) } ( ((2−τ)/(3τ))‖x̄^{k+1} − x^k‖² + ‖w̄^{k+1} − w^k‖²_Ξ ).        (3.34)

Moreover, according to the definition of Ω in (3.16), one has that

dist²_Ω(u, K) ≤ max{‖Θ‖, 1} dist²(u, K),   ∀u ∈ U.


Then, by using the above inequality, together with (3.26), (3.33) and (3.34), we can obtain, with the constant ρ > 0 defined in (3.27), that

dist²_Ω(ū^{k+1}, K) ≤ κ² max{‖Θ‖, 1} ‖R(ū^{k+1})‖² ≤ κ²ρ( ((2−τ)/(3τ))‖x̄^{k+1} − x^k‖² + ‖w̄^{k+1} − w^k‖²_Ξ ),   ∀k ≥ k̄.        (3.35)

It is easy to see from (3.17) that, for any k ≥ 0,

dist²_Ω(ū^{k+1}, K) ≤ dist²_Ω(u^k, K) − ( ((2−τ)/(3τ))‖x̄^{k+1} − x^k‖² + ‖w̄^{k+1} − w^k‖²_Ξ ).        (3.36)

Therefore, by combining (3.35) and (3.36) we can get

dist²_Ω(ū^{k+1}, K) ≤ (κ²ρ/(1 + κ²ρ)) dist²_Ω(u^k, K),   ∀k ≥ k̄.        (3.37)

From (3.36) and the fact that ζΞ ⪰ Θ we know that

dist²_Ω(u^k, K) ≥ min{ (2−τ)/(3τ), 1/ζ } ‖u^k − ū^{k+1}‖²_Ω,   ∀k ≥ 0.

Therefore, it holds that

‖u^k − ū^{k+1}‖_Ω ≤ β dist_Ω(u^k, K),   ∀k ≥ 0,        (3.38)

where the constant β > 0 is given in (3.28). By using the triangle inequality, we have that

‖u^k − Π^Ω_K(ū^{k+1})‖_Ω ≤ dist_Ω(u^k, K) + ‖Π^Ω_K(u^k) − Π^Ω_K(ū^{k+1})‖_Ω,   ∀k ≥ 0.        (3.39)

Moreover, from [28, Proposition 2.3] we know that

‖Π^Ω_K(u^k) − Π^Ω_K(ū^{k+1})‖²_Ω ≤ 〈Π^Ω_K(u^k) − Π^Ω_K(ū^{k+1}), Ω(u^k − ū^{k+1})〉,   ∀k ≥ 0.

Thus, one has

‖Π^Ω_K(u^k) − Π^Ω_K(ū^{k+1})‖_Ω ≤ ‖u^k − ū^{k+1}‖_Ω,   ∀k ≥ 0,

which, together with (3.38) and (3.39), implies that

‖u^k − Π^Ω_K(ū^{k+1})‖_Ω ≤ (1 + β) dist_Ω(u^k, K),   ∀k ≥ 0.        (3.40)

From the definitions of Θ in (3.14) and Ω in (3.16) we know that

‖u^{k+1} − ū^{k+1}‖²_Ω = (τσ)²‖A∗(w^{k+1} − w̄^{k+1})‖² + ‖w^{k+1} − w̄^{k+1}‖²_Θ
  = 〈 w^{k+1} − w̄^{k+1}, τσ( Σ_h + S + (2/3)(1+τ)σAA∗ )(w^{k+1} − w̄^{k+1}) 〉.

Based on the above equality, one can see from (3.19) and (3.29) that

‖u^{k+1} − ū^{k+1}‖_Ω ≤ μ‖d^k‖.        (3.41)

Since Assumption 3.2 holds, by using (3.30), (3.41) and the triangle inequality one can get that

‖u^{k+1} − Π^Ω_K(ū^{k+1})‖_Ω ≤ ‖u^{k+1} − ū^{k+1}‖_Ω + dist_Ω(ū^{k+1}, K)
  ≤ μ‖d^k‖ + dist_Ω(ū^{k+1}, K)
  ≤ μη_k‖u^k − u^{k+1}‖ + dist_Ω(ū^{k+1}, K)
  ≤ μη_k‖u^{k+1} − Π^Ω_K(ū^{k+1})‖_Ω + μη_k‖u^k − Π^Ω_K(ū^{k+1})‖_Ω + dist_Ω(ū^{k+1}, K),   ∀k ≥ k̄.

Then, by using the fact that ‖u^{k+1} − Π^Ω_K(ū^{k+1})‖_Ω ≥ dist_Ω(u^{k+1}, K) and (3.40), we can obtain that, for all k ≥ k̄,

(1 − μη_k) dist_Ω(u^{k+1}, K) ≤ μη_k(1 + β) dist_Ω(u^k, K) + dist_Ω(ū^{k+1}, K),

which, together with (3.37), implies (3.31). Finally, it is easy to see from (3.30) and (3.32) that sup_{k≥k̄} ϑ_k < 1. This completes the proof.

Remark 3.1. Note that if η_k → 0 as k → ∞, then condition (3.32) holds for all sufficiently large k.

3.3 Non-Ergodic Iteration Complexity

With the inequalities established in the previous subsections, one can easily obtain the following non-ergodic iteration complexity results for Algorithm iPALM.

Theorem 3.3. Suppose that Assumption 3.1 holds. Let {u^k = (x^k, w^k)} be the sequence generated by Algorithm iPALM that converges to u∗ := (x∗, w∗) ∈ K. Then, the KKT residual mapping R defined in (3.22) satisfies

min_{0≤j≤k} ‖R(u^j)‖² ≤ ϱ/k   and   lim_{k→∞}( k · min_{0≤j≤k} ‖R(u^j)‖² ) = 0,        (3.42)

where the constant ϱ is defined by

ϱ := max{ (12σ²(τ−1)²‖A∗A‖ + 3)/(τσ²(2−τ)), 2ζ‖Σ_h + S‖/(τσ) } ē        (3.43)

with ē := ‖u_e^0‖²_Ω + 2τσ‖Θ^{-1/2}‖( Σ_{j=0}^∞ ε_j )( ‖u_e^0‖_Ω + μ Σ_{j=0}^∞ ε_j ) + 4 Σ_{j=0}^∞ ε_j².

Proof. From (3.17) in Theorem 3.1(a), applied to the exact iterate ū^{j+1} (i.e., with d^j = 0), we know that ‖ū_e^{j+1}‖_Ω ≤ ‖u_e^j‖_Ω, ∀j ≥ 0. Moreover, (3.41) still holds with μ given in (3.29), so that

‖u^{j+1} − ū^{j+1}‖_Ω ≤ μ‖d^j‖,   ∀j ≥ 0.

Therefore,

‖w_e^{j+1}‖_Θ ≤ ‖u_e^{j+1}‖_Ω ≤ ‖u_e^j‖_Ω + μ‖d^j‖ ≤ ‖u_e^0‖_Ω + μ Σ_{j=0}^∞ ε_j,   ∀j ≥ 0.

Consequently, for any k ≥ 0,

Σ_{j=0}^k 〈d^j, w_e^{j+1}〉 ≤ ‖Θ^{-1/2}‖ Σ_{j=0}^k ‖d^j‖ ‖w_e^{j+1}‖_Θ ≤ ‖Θ^{-1/2}‖( Σ_{j=0}^∞ ε_j )( ‖u_e^0‖_Ω + μ Σ_{j=0}^∞ ε_j ).        (3.44)

Also, from (3.17) of Theorem 3.1(a) we know that for any k ≥ 0,

‖u_e^0‖²_Ω ≥ ‖u_e^0‖²_Ω − ‖u_e^{k+1}‖²_Ω = Σ_{j=0}^k ( ‖u_e^j‖²_Ω − ‖u_e^{j+1}‖²_Ω )
  ≥ Σ_{j=0}^k ( ‖w^{j+1} − w^j‖²_Ξ + ((2−τ)/(3τ))‖x^{j+1} − x^j‖² ) − 2τσ Σ_{j=0}^k 〈d^j, w_e^{j+1}〉.        (3.45)

Moreover, from (3.23) we know that

‖R(u^{k+1})‖² ≤ ((4σ²(τ−1)²‖A∗A‖ + 1)/(τ²σ²))‖x^{k+1} − x^k‖² + (2ζ‖Σ_h + S‖/(τσ))‖w^{k+1} − w^k‖²_Ξ + 4‖d^k‖².

Therefore, we can get from (3.44) and (3.45) that Σ_{j=0}^∞ ‖R(u^{j+1})‖² ≤ ϱ. From here, we can easily get the required results in (3.42).
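The first bound in (3.42) rests on a pigeonhole argument: whenever the partial sums of the squared residuals stay below ϱ, the running minimum must fall below ϱ/(k + 1). A quick check with a made-up summable sequence (the data below are not from the paper):

```python
# Pigeonhole step behind (3.42): if sum_{j<=k} r_j <= rho for every k,
# then min_{0<=j<=k} r_j <= rho / (k + 1), since otherwise the sum of the
# first k+1 terms would already exceed rho.

def check_min_residual_bound(r, rho):
    for k in range(len(r)):
        if sum(r[: k + 1]) > rho:
            return False
        if min(r[: k + 1]) > rho / (k + 1):
            return False
    return True
```

For instance, with rho = 2.0 and r_j = rho · 0.5^{j+1}, the partial sums never exceed rho and the bound holds for every k.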


4 The Equivalence Property

In this section, we establish the equivalence of an inexact block sGS decomposition based multi-block indefinite-proximal ADMM for solving problem (1.1) to the inexact indefinite-proximal ALM presented in the previous section. The iteration scheme of the former has already been briefly sketched in (1.6) in the introduction. Here we shall formally present it as Algorithm sGS-iPADMM.

Algorithm sGS-iPADMM: An inexact block sGS decomposition based indefinite-proximal ADMM

Let {ε_k} be a summable sequence of nonnegative real numbers, τ ∈ (0, 2) be the (dual) step-length, and (x⁰, y⁰, z⁰) ∈ X × dom p × Y₂ × ··· × Y_s × Z be the given initial point. Choose the self-adjoint linear operators D_i : Y_i → Y_i, i = 1, . . . , s. For k = 0, 1, . . ., perform the following steps in each iteration.

Step 1. For i = s, . . . , 2, compute

y_i^{k+1/2} ≈ argmin_{y_i∈Y_i} { L_σ( (y_{<i}^k; y_i; y_{>i}^{k+1/2}), z^k; (x^k, y^k) ) + (1/2)‖y_i − y_i^k‖²_{D_i} },

such that there exists δ̄_i^k satisfying ‖δ̄_i^k‖ ≤ ε_k and

δ̄_i^k ∈ ∂_{y_i} L_σ( (y_{<i}^k; y_i^{k+1/2}; y_{>i}^{k+1/2}), z^k; (x^k, y^k) ) + D_i( y_i^{k+1/2} − y_i^k ).

Step 2. For i = 1, . . . , s, compute

y_i^{k+1} ≈ argmin_{y_i∈Y_i} { L_σ( (y_{<i}^{k+1}; y_i; y_{>i}^{k+1/2}), z^k; (x^k, y^k) ) + (1/2)‖y_i − y_i^k‖²_{D_i} },

such that there exists δ_i^k satisfying ‖δ_i^k‖ ≤ ε_k and

δ_i^k ∈ ∂_{y_i} L_σ( (y_{<i}^{k+1}; y_i^{k+1}; y_{>i}^{k+1/2}), z^k; (x^k, y^k) ) + D_i( y_i^{k+1} − y_i^k ).

Step 3. Compute z^{k+1} ≈ argmin_{z∈Z} L_σ( y^{k+1}, z; (x^k, y^k) ) such that ‖γ^k‖ ≤ ε_k with

γ^k := ∇_z L_σ( y^{k+1}, z^{k+1}; (x^k, y^k) ) = Gx^k − b + σG( F∗y^{k+1} + G∗z^{k+1} − c ).        (4.1)

Step 4. Compute x^{k+1} := x^k + τσ( F∗y^{k+1} + G∗z^{k+1} − c ).

Recall that the KKT system of problem (1.1) is given by

0 ∈ ( ∂p(y₁); 0 ) + ∇f(y) + Fx,   Gx − b = 0,   F∗y + G∗z = c.        (4.2)

We make the following assumption on problem (1.1) throughout this section.

Assumption 4.1. The solution set to the KKT system (4.2) is nonempty.

Note that if Slater's constraint qualification (SCQ) holds for problem (1.1), i.e.,

{ (y, z) | y₁ ∈ ri(dom p), F∗y + G∗z = c } ≠ ∅,

then we know from [44, Corollaries 28.2.2 & 28.3.1] that a vector (ȳ, z̄) ∈ Y × Z is a solution to problem (1.1) if and only if there exists a Lagrange multiplier x̄ ∈ X such that (x̄, ȳ, z̄) is a solution to the KKT system (4.2). Therefore, Assumption 4.1 holds if the SCQ holds and (1.1) has at least one optimal solution. Moreover, for any (x̄, ȳ, z̄) ∈ X × Y × Z satisfying (4.2), we know from [44, Corollary 30.5.1] that (ȳ, z̄) is an optimal solution to problem (1.1) and x̄ is an optimal solution to its dual problem.

Recall that the majorized augmented Lagrangian function of problem (1.1) was given in (1.5). Note that one can always write Fx = (F₁x; . . . ; F_s x), ∀x ∈ X, with each F_i : X → Y_i being a given linear mapping.

For later discussions, we symbolically decompose the self-adjoint linear operator Σ_f in the following form:

Σ_f = [ Σ_{f,11}  Σ_{f,12}  ···  Σ_{f,1s}
        Σ_{f,21}  Σ_{f,22}  ···  Σ_{f,2s}
          ⋮         ⋮       ⋱      ⋮
        Σ_{f,s1}  Σ_{f,s2}  ···  Σ_{f,ss} ]   with Σ_{f,ij} : Y_j → Y_i, ∀1 ≤ i, j ≤ s.        (4.3)

Based on the above decomposition, we make the following assumption on the choice of the proximal terms in Algorithm sGS-iPADMM.

Assumption 4.2. The self-adjoint linear operators D_i : Y_i → Y_i, i = 1, . . . , s in Algorithm sGS-iPADMM are chosen such that

(1/2)Σ_{f,ii} + σF_iF∗_i + D_i ≻ 0, i = 1, . . . , s,   and   D := Diag(D₁, . . . , D_s) ⪰ −(1/2)Σ_f.        (4.4)

We are now ready to prove the equivalence of Algorithm sGS-iPADMM and Algorithm iPALM for solving problem (1.1). We begin by applying the inexact block sGS decomposition technique in [32, Theorem 1] to express the procedure for computing y^{k+1} in Steps 1 and 2 of Algorithm sGS-iPADMM in a more compact fashion. For this purpose, we define the linear operator

N := Σ_f + σFF∗ + D.        (4.5)

Note that the self-adjoint linear operator N is positive semidefinite if Assumption 4.2 holds. Moreover, as can be seen from (1.5), for any given (x, y′, z) ∈ X × Y × Z, the linear operator N contains all the quadratic information of

L_σ(y, z; (x, y′)) + (1/2)‖y − y′‖²_D

with respect to y. Based on (4.3), the linear operator N can be decomposed as N = N_d + N_u + N∗_u with N_d and N_u being the block-diagonal and the strictly block-upper-triangular parts of N, respectively, i.e.,

N_d := Diag(N₁₁, . . . , N_ss)   with   N_ii := Σ_{f,ii} + σF_iF∗_i + D_i, i = 1, . . . , s,

and

N_u := [ 0  N₁₂  ···  N₁s
         0   0    ⋱    ⋮
         ⋮   ⋮    ⋱  N_{(s−1)s}
         0   0   ···   0 ]   with   N_ij := Σ_{f,ij} + σF_iF∗_j, ∀1 ≤ i < j ≤ s.        (4.6)

For convenience, for each k ≥ 0 in Algorithm sGS-iPADMM, we denote δ̄k1 := δk1, δ̄k := (δ̄k1, δ̄k2, . . . , δ̄ks) and δk := (δk1, . . . , δks), where δ̄ki and δki are the errors incurred in the backward and the forward sweeps, respectively. Suppose that Assumption 4.2 holds. We can define the sequence {δksGS} ⊆ Y by

  δksGS := δk + NuNd⁻¹(δk − δ̄k).   (4.7)


Moreover, we can define the linear operator

  NsGS := NuNd⁻¹N∗u.   (4.8)

Based on the above definitions, we have the following result, which is a direct consequence of [32, Theorem 1].

Lemma 4.1. Suppose that Assumption 4.2 holds. The iterate yk+1 in Step 2 of Algorithm sGS-iPADMM is the unique solution to the perturbed proximal minimization problem given by

  yk+1 = arg min_{y∈Y} { Lσ(y, zk; (xk, yk)) + (1/2)‖y − yk‖²_{D+NsGS} − 〈δksGS, y〉 }.   (4.9)

Moreover, it holds that N + NsGS = (Nd + Nu)Nd⁻¹(Nd + N∗u) ≻ 0.
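The identity at the end of Lemma 4.1 is purely algebraic and can be checked numerically for any block splitting. The following NumPy sketch (random blocks of hypothetical sizes, not tied to any particular problem instance) forms Nd, Nu and NsGS as in (4.6) and (4.8) and verifies that N + NsGS = (Nd + Nu)Nd⁻¹(Nd + N∗u):

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [2, 3, 2]                      # hypothetical block sizes, s = 3
n = sum(sizes)

# Random symmetric positive definite N, split as N = Nd + Nu + Nu^T with
# Nd the block-diagonal part and Nu the strict block-upper triangular part.
M = rng.standard_normal((n, n))
N = M @ M.T + n * np.eye(n)

offsets = np.cumsum([0] + sizes)
Nd = np.zeros_like(N)
for a, b in zip(offsets[:-1], offsets[1:]):
    Nd[a:b, a:b] = N[a:b, a:b]
Nu = np.triu(N - Nd)                   # strict block-upper part (diagonal blocks are zero)

NsGS = Nu @ np.linalg.solve(Nd, Nu.T)  # N_sGS = Nu Nd^{-1} Nu^T, positive semidefinite
lhs = N + NsGS
rhs = (Nd + Nu) @ np.linalg.solve(Nd, Nd + Nu.T)
print(np.allclose(lhs, rhs))           # the sGS identity of Lemma 4.1
```

Expanding (Nd + Nu)Nd⁻¹(Nd + N∗u) = Nd + Nu + N∗u + NuNd⁻¹N∗u makes the identity evident; the script only confirms it in floating point.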

Remark 4.1. From (4.9) one can get the interpretation of the linear operator NsGS defined in (4.8). That is, by adding the proximal term (1/2)‖y − yk‖²_D to the majorized augmented Lagrangian function and conducting one cycle of sGS-type block coordinate minimization via Steps 1 and 2 in Algorithm sGS-iPADMM, the resulting yk+1 is an inexact solution to the following problem

  min_{y∈Y}  Lσ(y, zk; (xk, yk)) + (1/2)‖y − yk‖²_D + (1/2)‖y − yk‖²_{NsGS},

where the proximal term (1/2)‖y − yk‖²_{NsGS} is generated by the sGS-type iteration, with the linear operator NsGS being defined by (4.8) and (4.5), while δksGS defined in (4.7) represents the error accumulated from δk and δ̄k after one cycle of the sGS-type update.

The following elementary result4 will be frequently used later.

Lemma 4.2. The self-adjoint linear operator GG∗ is nonsingular (positive definite) on the subspace Range(G) of Z.

Now, we start to establish the equivalence between Algorithm sGS-iPADMM and Algorithm iPALM. The first step is to show that the procedure of obtaining (yk+1, zk+1) in Algorithm sGS-iPADMM can be viewed as the procedure of getting wk+1 in Algorithm iPALM. For this purpose, we define the block-diagonal linear operator T : Y × Z → Y × Z by

  T(y; z) := ( (D + NsGS + σFG∗[GG∗]†GF∗)y ; 0 ), ∀ (y, z) ∈ Y × Z.   (4.10)

Moreover, we define the sequence {∆k} in Y by

  ∆k := δksGS − FG∗[GG∗]†(γk−1 − γk − G(xk−1 − xk)), k ≥ 0,   (4.11)

with the convention that

  x−1 := x0 − τσ(F∗y0 + G∗z0 − c),  γ−1 := −b + Gx−1 + σG(F∗y0 + G∗z0 − c).   (4.12)

Based on the above definitions and Lemma 4.1, we have the following result.

Proposition 4.1. Suppose that Assumption 4.2 holds. Then,

(a) Algorithm sGS-iPADMM is well-defined;

4This lemma can be directly verified via the singular value decomposition of the linear operator G and some basic calculationsfrom linear functional analysis.


(b) the sequence {(xk, yk, zk)} generated by Algorithm sGS-iPADMM satisfies

  (∆k; γk) ∈ ∂(y,z)Lσ((yk+1, zk+1); (xk, yk)) + T(yk+1 − yk; zk+1 − zk), ∀ k ≥ 0.   (4.13)

Proof. (a) Since Assumption 4.2 holds, it is easy to see from Lemma 4.1 that Steps 1 and 2 in Algorithm sGS-iPADMM are well-defined for any k ≥ 0. Moreover, from (4.1) we know that Step 3 of Algorithm sGS-iPADMM is well-defined if, for any k ≥ 0, the following linear system with respect to z

Gxk − b+ σG(F∗yk+1 + G∗z − c) = 0 (4.14)

has a solution. Since b ∈ Range(G), we know that (b − Gxk)/σ − G(F∗yk+1 − c) ∈ Range(G). Therefore, Lemma 4.2 implies that the linear system

GG∗z = (b− Gxk)/σ − G(F∗yk+1 − c)

or equivalently the linear system (4.14), has a solution. Consequently, Algorithm sGS-iPADMM is well-defined.

(b) From (4.1) and (4.12) we know that for any k ≥ 0,

γk−1 = −b+ Gxk−1 + σG(F∗yk + G∗zk − c) (4.15)

so that γk−1 ∈ Range(G) and GG∗zk = (γk−1 + b− Gxk−1)/σ − GF∗yk + Gc. Hence,

  GG∗(zk − zk+1) = (1/σ)(γk−1 − γk − Gxk−1 + Gxk) − GF∗(yk − yk+1), ∀ k ≥ 0.

Therefore, one can get5 that for any k ≥ 0,

  σFG∗(zk − zk+1) = FG∗[GG∗]†(γk−1 − γk − G(xk−1 − xk)) + σFG∗[GG∗]†GF∗(yk+1 − yk).   (4.16)

From (4.9) in Lemma 4.1 we know that, for any k ≥ 0,

  δksGS ∈ ∂yLσ(yk+1, zk; (xk, yk)) + (D + NsGS)(yk+1 − yk)
       = ∂yLσ(yk+1, zk+1; (xk, yk)) + (D + NsGS)(yk+1 − yk) + σFG∗(zk − zk+1).   (4.17)

Then, by substituting (4.16) into (4.17) and using the definition of ∆k in (4.11), one has that

  ∆k ∈ ∂yLσ(yk+1, zk+1; (xk, yk)) + (D + NsGS + σFG∗[GG∗]†GF∗)(yk+1 − yk),

which, together with (4.1), implies that (4.13) holds. This completes the proof.

The following important result will be used later.

Proposition 4.2. Suppose that Assumptions 4.1 and 4.2 hold. Let {(xk, yk, zk)} be the sequence generated by Algorithm sGS-iPADMM. Define ξ0 := ‖b − Gx0‖ and

  ξk := |1 − τ|^k ξ0 + τ ∑_{i=1}^{k} |1 − τ|^{k−i} ε_{i−1}, ∀ k ≥ 1.

Then, it holds that ‖b − Gxk‖ ≤ ξk for all k ≥ 0 and

  ∑_{k=0}^{∞} ‖b − Gxk‖ ≤ ∑_{k=0}^{∞} ξk < +∞.

5This can be routinely derived by using the singular value decomposition of G and the definition of the Moore-Penrosepseudoinverse.


Proof. We know from Step 4 of Algorithm sGS-iPADMM and (4.12) that

  xk = xk−1 + τσ(F∗yk + G∗zk − c), ∀ k ≥ 0.

Hence, one has that

  b − Gxk = b − Gxk−1 − τσG(F∗yk + G∗zk − c), ∀ k ≥ 0.

Moreover, from (4.12) and (4.15) we know that

  τ(γk−1 + b − Gxk−1) = τσG(F∗yk + G∗zk − c), ∀ k ≥ 0.

Thus, by combining the above two equalities together, one can get

  b − Gxk = b − Gxk−1 − τ(γk−1 + b − Gxk−1) = (1 − τ)(b − Gxk−1) − τγk−1, ∀ k ≥ 0.

Consequently, it holds that

‖b− Gxk‖ ≤ |1− τ | ‖b− Gxk−1‖+ τ‖γk−1‖, ∀ k ≥ 0,

and hence

  ‖b − Gxk‖ ≤ |1 − τ|^k ‖b − Gx0‖ + τ ∑_{i=1}^{k} |1 − τ|^{k−i} ‖γi−1‖ ≤ ξk, ∀ k ≥ 0.   (4.18)

Note that τ ∈ (0, 2). It is easy to see that

  ∑_{k=0}^{∞} ‖b − Gxk‖ ≤ ∑_{k=0}^{∞} ξk ≤ (∑_{k=0}^{∞} |1 − τ|^k) ξ0 + τ ∑_{k=1}^{∞} ∑_{i=1}^{k} |1 − τ|^{k−i} ε_{i−1}
                        ≤ (∑_{k=0}^{∞} |1 − τ|^k) ξ0 + τ (∑_{k=0}^{∞} |1 − τ|^k)(∑_{i=0}^{∞} ε_i) < +∞,

which completes the proof.
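The recursion b − Gxk = (1 − τ)(b − Gxk−1) − τγk−1 underlying the proof yields ξk = |1 − τ|ξk−1 + τεk−1, and the geometric-series bound above can be reproduced numerically. A small Python sketch with τ = 1.9 and a hypothetical summable error sequence εk = 2⁻ᵏ:

```python
# Verify the bound  sum_k xi^k <= (sum_k |1-tau|^k) * (xi0 + tau * sum_k eps_k)
tau, xi0 = 1.9, 1.0
eps = [0.5 ** k for k in range(200)]      # summable errors (hypothetical)
r = abs(1.0 - tau)                        # contraction factor |1 - tau| = 0.9

# xi^k = |1-tau|^k xi^0 + tau * sum_{i=1}^k |1-tau|^{k-i} eps_{i-1},
# generated here through the equivalent recursion.
xi = [xi0]
for k in range(1, 200):
    xi.append(r * xi[-1] + tau * eps[k - 1])

total = sum(xi)
bound = xi0 / (1 - r) + tau * (1 / (1 - r)) * sum(eps)
print(total <= bound)
```

The truncated sums keep the comparison rigorous: every term dropped from the left-hand side is nonnegative, so the printed inequality must hold whenever τ ∈ (0, 2) and the errors are summable.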

Now, we start to show that the sequence {(xk, yk, zk)} generated by Algorithm sGS-iPADMM can be viewed as a sequence generated by Algorithm iPALM from the same initial point. For this purpose, we define the space V := Y × Range(G), and we define the linear operators B : X → V and P : V → V by

  Bx := (Fx; Gx), ∀ x ∈ X and P(y, z) := (Σf y; 0), ∀ (y, z) ∈ V.   (4.19)

Moreover, we define the closed proper convex function φ : V → (−∞, +∞] by

  φ(v) = φ(y, z) := p(y1) + f(y) − 〈b, z〉, ∀ v = (y, z) ∈ V,

and define

  Lσ(v; (x, v′)) := Lσ(y, z; (x, y′)), ∀ v = (y, z) ∈ V, v′ = (y′, z′) ∈ V.   (4.20)

Based on the above definitions, problem (1.1) can be viewed as an instance of problem (3.1). In this case, the following result serves the purpose of viewing Algorithm sGS-iPADMM as an instance of Algorithm iPALM.

Theorem 4.1. Suppose that Assumptions 4.1 and 4.2 hold. Let {(xk, yk, zk)} be the sequence generated by Algorithm sGS-iPADMM. Define

  vk := (yk; ΠRange(G)(zk)), ∀ k ≥ 0.   (4.21)

Then, for any k ≥ 0, it holds that


(a) the linear operators T, B and P defined in (4.10) and (4.19) satisfy

  T ⪰ −(1/2)P and 〈v, ((1/2)P + σBB∗ + T)v〉 > 0, ∀ v ∈ V \ {0};   (4.22)

(b) there exists a sequence {ε̂k} of nonnegative real numbers such that

  ‖(∆k; γk)‖ ≤ ε̂k and ∑_{k=0}^{∞} ε̂k < +∞;

(c) it holds that

  vk+1 ≈ arg min_{v∈V} { Lσ(v; (xk, vk)) + (1/2)‖v − vk‖²_T }

in the sense that

  (∆k; γk) ∈ ∂vLσ(vk+1; (xk, vk)) + T(vk+1 − vk) and ‖(∆k; γk)‖ ≤ ε̂k.

Proof. (a) According to (4.4) in Assumption 4.2 we know that D ⪰ −(1/2)Σf. Moreover, from (4.8) we know that NsGS ⪰ 0. Thus, one can readily see from (4.10) and (4.19) that T ⪰ −(1/2)P. On the other hand, one can symbolically do the decomposition

  (1/2)P + σBB∗ + T = ⎡ (1/2)Σf + σFF∗ + D + NsGS + σFG∗[GG∗]†GF∗   σFG∗ ⎤
                      ⎣ σGF∗                                        σGG∗ ⎦.

From Lemma 4.2, we know that GG∗ is nonsingular on Range(G). Therefore, by using the definition of V and the Schur complement condition for ensuring the positive definiteness of a linear operator, we only need to show that (1/2)Σf + σFF∗ + D + NsGS ≻ 0 on Y. Suppose on the contrary that it is not positive definite. Then, there exists a nonzero vector y ∈ Y such that

  〈y, ((1/2)Σf + σFF∗ + D + NsGS)y〉 = 〈y, ((1/2)Σf + D + σFF∗)y〉 + 〈y, NsGS y〉 = 0.

From (4.4) of Assumption 4.2 and (4.8) we know that (1/2)Σf + D + σFF∗ ⪰ 0 and NsGS ⪰ 0, so that

  〈y, ((1/2)Σf + D + σFF∗)y〉 = 0 = 〈y, NsGS y〉.

Then, by using (4.8) we can get that N∗u y = 0. This, together with (4.6), implies that

  0 = 〈y, ((1/2)Σf + D + σFF∗)y〉
    = (1/2)〈y, (Σf + σFF∗)y〉 + 〈y, ((1/2)σFF∗ + D)y〉
    = (1/2)〈y, (Σf + σFF∗)d y〉 + 〈y, ((1/2)σFF∗ + D)y〉
    = 〈y, ((1/2)(Σf)d + D)y〉 + (σ/2)〈y, (FF∗)d y〉 + (σ/2)〈y, FF∗ y〉,   (4.23)

where

  (Σf + σFF∗)d := Diag(Σf11 + σF1F∗1, . . . , Σfss + σFsF∗s),  (FF∗)d := Diag(F1F∗1, . . . , FsF∗s).

Since D ⪰ −(1/2)Σf implies (1/2)(Σf)d + D ⪰ 0, we obtain from (4.23) that

  〈y, ((1/2)(Σf)d + D)y〉 = (σ/2)〈y, (FF∗)d y〉 = (σ/2)〈y, FF∗ y〉 = 0,


which contradicts the requirement in Assumption 4.2 that (1/2)Σfii + σFiF∗i + Di ≻ 0 for all i = 1, . . . , s. Therefore, it holds that (1/2)Σf + σFF∗ + D + NsGS ≻ 0, and this completes the proof of (a).

(b) From the definition of ∆k in (4.11) one can see that for all k ≥ 0,

  ‖∆k‖ ≤ ‖δksGS‖ + ‖FG∗[GG∗]†‖ ‖γk−1 − γk − G(xk−1 − xk)‖.

Then, by using the fact that max{‖δ̄ki‖, ‖δki‖, ‖γk‖} ≤ εk, we can get from Proposition 4.2 and the definition of δksGS in (4.7) that for all k ≥ 1,

  ‖(∆k; γk)‖ ≤ ‖γk‖ + ‖∆k‖ ≤ ε̂k := (s + 1)εk + 2s‖NuNd⁻¹‖εk + ‖FG∗[GG∗]†‖(εk−1 + ξk−1 + εk + ξk).

Moreover, we define ε̂0 := ‖(∆0; γ0)‖. Then, according to Proposition 4.2 and the fact that the sequence {εk} is summable, we know that ∑_{k=0}^{∞} ε̂k < +∞.

(c) According to (4.10), (4.13) and (4.20), we only need to show that

  ∂(y,z)Lσ((yk+1, zk+1); (xk, yk)) = ∂(y,z)Lσ((yk+1, ΠRange(G)(zk+1)); (xk, yk)), ∀ k ≥ 0.

From (1.4) and (1.5) we can get that

  ∂yLσ(y, z; (x, y′)) = (∂y1 p(y1); 0) + ∇f(y′) + Σf(y − y′) + Fx + σF(F∗y + G∗z − c)

and

  ∇zLσ(y, z; (x, y′)) = −b + Gx + σG(F∗y + G∗z − c).

Therefore, by using the fact that G∗zk+1 = G∗ΠRange(G)(zk+1), ∀ k ≥ 0, we know that part (c) of the theorem holds. This completes the proof.

Remark 4.2. One can see that in Algorithm sGS-iPADMM, the sequence {(xk, yk, zk)} is generated, while the sequence {ΠRange(G)(zk)} is never explicitly calculated. Note that once zk is computed, only the vector G∗zk is needed during the next iteration, instead of zk itself. Since G∗zk = G∗ΠRange(G)(zk), ∀ k ≥ 0, one may view the sequence {ΠRange(G)(zk)} ⊆ Range(G) as a shadow sequence of {zk}. It is never explicitly computed, but still plays an important role in establishing the convergence of the algorithm. In fact, similar observations have been made and extensively used in [30, 31].
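The shadow-sequence observation rests on the elementary identity G∗z = G∗ΠRange(G)(z) for every z ∈ Z. A minimal NumPy sketch, with a hypothetical rank-deficient matrix standing in for G, illustrates that the component of z outside Range(G) is invisible to the algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 8))  # G : X -> Z, rank 3 < 5
z = rng.standard_normal(5)

P = G @ np.linalg.pinv(G)          # orthogonal projector onto Range(G) in Z
z_shadow = P @ z                   # Pi_Range(G)(z), never formed by the algorithm

# Only G* z enters the next iteration, and it is blind to the Null(G*) part of z:
print(np.allclose(G.T @ z, G.T @ z_shadow))
```

Since P is the orthogonal projector onto Range(G), one has PG = G and hence G∗P = G∗, which is exactly what the script confirms.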

By combining the results of Theorem 3.1 and Theorem 4.1, one can readily get the following convergence theorem for Algorithm sGS-iPADMM.

Theorem 4.2. Suppose that Assumptions 4.1 and 4.2 hold. Let {(xk, yk, zk)} be the sequence generated by Algorithm sGS-iPADMM. Then,

(a) the sequence {(yk, ΠRange(G)(zk))} converges to a solution to problem (1.1) and the sequence {xk} converges to a solution to the dual of (1.1);

(b) any accumulation point of the sequence {(yk, zk)} is a solution to problem (1.1);

(c) the sequence {p(yk1) + f(yk) − 〈b, zk〉} of objective values converges to the optimal value of problem (1.1), and

  lim_{k→∞} (F∗yk + G∗zk − c) = 0;

(d) it holds, with K being the solution set to the KKT system (4.2), that

  lim_{k→∞} dist((xk, yk, zk), K) = 0;


(e) if the linear operator G is surjective, the whole sequence {(xk, yk, zk)} converges to a solution to the KKT system (4.2) of problem (1.1).

Proof. (a) Note that the sequence {vk = (yk; ΠRange(G)(zk))} defined in (4.21) lies in Y × Range(G). By using Theorem 4.1(c), one can treat the sequence {(xk, vk)} generated by Algorithm sGS-iPADMM as one generated by Algorithm iPALM with the given initial point (x0, v0). In addition, (4.22) in Theorem 4.1 guarantees that condition (3.5) in Assumption 3.1 holds. Thus, by Theorem 3.1, the sequence {(xk, vk)} converges to a solution to the KKT system (4.2), i.e., the sequences {(yk, ΠRange(G)(zk))} and {xk} converge to a solution to problem (1.1) and its dual, respectively.

(b) From (a), we see that lim_{k→∞}(xk, yk, ΠRange(G)(zk)) = (x∗, y∗, z∗), which is a solution to the KKT system (4.2). Since G∗zk = G∗ΠRange(G)(zk), ∀ k ≥ 1, any accumulation point, say z∞, of {zk} satisfies G∗z∞ = G∗z∗. Then, it is easy to verify that (x∗, y∗, z∞) also satisfies the KKT system (4.2). Therefore, (y∗, z∞) is a solution to problem (1.1).

(c) From (a) and the fact that the objective function of problem (1.1) is continuous on its domain, we know that {p(yk1) + f(yk) − 〈b, ΠRange(G)(zk)〉} converges to the optimal value of problem (1.1). Since b ∈ Range(G), it holds for any k ≥ 1 that 〈b, zk〉 = 〈b, ΠRange(G)(zk)〉. Thus,

  p(yk1) + f(yk) − 〈b, ΠRange(G)(zk)〉 = p(yk1) + f(yk) − 〈b, zk〉, ∀ k ≥ 1.

Therefore, the sequence {p(yk1) + f(yk) − 〈b, zk〉} converges to the optimal value of problem (1.1). Meanwhile, since G∗zk = G∗ΠRange(G)(zk), we further have that

  lim_{k→∞} (F∗yk + G∗zk − c) = lim_{k→∞} (F∗yk + G∗ΠRange(G)(zk) − c) = 0.

(d) From (a), we have that (x∗, y∗, z∗), the limit point of {(xk, yk, ΠRange(G)(zk))}, is a solution to the KKT system (4.2), i.e., (x∗, y∗, z∗) ∈ K. Since G∗(zk − ΠRange(G)(zk)) = 0 for any k ≥ 1, it is not difficult to see that

  (x∗, y∗, z∗ + (zk − ΠRange(G)(zk))) ∈ K, ∀ k ≥ 1.

Therefore, it holds for all k ≥ 1 that

  dist((xk, yk, zk), K) ≤ ‖xk − x∗‖ + ‖yk − y∗‖ + ‖ΠRange(G)(zk) − z∗‖,

and lim_{k→∞} dist((xk, yk, zk), K) = 0.

(e) In this case, it holds that Range(G) = Z and zk = ΠRange(G)(zk), ∀ k ≥ 0. The result follows from (a),

which completes the proof of the theorem.

We make the following remark on Theorem 4.2.

Remark 4.3. Without additional assumptions on G, one can observe that the solution set of problem (1.1) can be unbounded, and the sequence {zk} generated by Algorithm sGS-iPADMM may also be unbounded. Fortunately, we are still able to show in Theorem 4.2(a) and (c) that the sequence {(xk, yk, ΠRange(G)(zk))} converges to a solution to the KKT system (4.2), and that the objective values and the feasibility measure converge to the optimal value and zero, respectively. Meanwhile, we would like to emphasize that the surjectivity assumption on G in Theorem 4.2(e) is not restrictive at all. Indeed, this assumption simply means that there are no redundant equations in the linear constraints Gx = b in the primal problem (1.2). If necessary, well-established numerical linear algebra techniques can be used to remove redundant equations from Gx = b.

4.1 The Two-Block Case

Consider the two-block case where Y = Y1 and f is vacuous, i.e., the following problem:

  min_{y,z} { p(y) − 〈b, z〉 | F∗y + G∗z = c }.   (4.24)


Table 1: Comparison between [18] and this paper. In the table, ‘SOL’ denotes the solution set to problem (4.24), ‘X’ denotes the set of multipliers (the solution set to the dual problem) of problem (4.24), and ‘K’ denotes the solution set to the KKT system of problem (4.24), i.e., K = X × SOL. The symbol → means that the sequence on its left-hand side is convergent and converges to a point in the set on its right-hand side. The two columns under “This paper” correspond to the results without and with the surjectivity of G, respectively.

Item \ Ref      | [18]                              | This paper                          | This paper
Updating rules  | z ⇒ y ⇒ x & τ ∈ (0, 2)            | y ⇒ z ⇒ x & τ ∈ (0, 2)              | y ⇒ z ⇒ x & τ ∈ (0, 2)
Assumptions-y   | p strongly convex and             | p strongly convex                   | p strongly convex
                | F the identity operator           | or F surjective                     | or F surjective
Assumptions-z   | G surjective                      | –                                   | G surjective
Sequences       | {(yk, zk)} → SOL, {xk} bounded    | dist((xk, yk, zk), K) → 0, {xk} → X | {(yk, zk)} → SOL, {xk} → X

Assume that the KKT system of problem (4.24) admits a nonempty solution set K. For such a two-block problem, Algorithm sGS-iPADMM without the proximal terms and the inexact computations reduces to the classic ADMM. Then, by Theorem 4.2, the sequence {(xk, yk, ΠRange(G)(zk))} generated by the classic ADMM or its inexact variants with τ ∈ (0, 2) (in the order that the y-subproblem is solved before the z-subproblem) converges to a point in K if either F is surjective or p is strongly convex. Moreover, if G is also surjective, we have that the sequence {(xk, yk, zk)} converges to a point in K. Note that the assumptions we make for problem (4.24) are evidently weaker than those in [18], where F is assumed to be the identity operator, G is surjective, and p is assumed to be strongly convex. Moreover, in [18, Theorem 3.1], only the convergence of the primal sequence {(yk, zk)} and the boundedness of the dual sequence {xk} were obtained.

The detailed comparison between the results in this paper and those in [18] is presented in Table 1. As can be observed from this table, the convergence of the dual sequence {xk} is easier to derive than that of the primal sequence {(yk, zk)}; this is consistent with the results in [5] for the classic ADMM and with those for the ALM in [45]. Hence, the results derived in this paper properly resolve the questions we have mentioned in the introduction.

Finally, we should mention that, in Sun et al. [48, Theorem 3.3 (iv)], a result similar to ours has been derived under the requirements that the initial multiplier x0 satisfies Gx0 − b = 0 and that all the subproblems are solved exactly. Here, we are able to relax these requirements to the most general case and extend our results to the more interesting and challenging multi-block problems.

4.2 Linear Rate of Convergence and Iteration Complexity

Theorem 3.2 has provided a tool which, together with Theorem 4.1, can be used to analyze the linear convergence rate of the sequence generated by Algorithm sGS-iPADMM: one only needs to verify whether (3.30) is valid for this sequence, provided that the metric subregularity property (3.26) holds. However, such a verification is not as straightforward as it conceptually seems.

Here, we establish a linear convergence result for the case where the linear system in Step 3 of Algorithm sGS-iPADMM is solved exactly, but leave the general case as a topic for further study. For this purpose, we


view problem (1.1) as an instance of (3.1) with

  ϕ(w) := p(y1),  h(w) := f(y) − 〈b, z〉,  A∗w := F∗y + G∗z,  ∀ w = (y, z) ∈ W := Y × Z.   (4.25)

Then, the corresponding KKT residual mapping of problem (1.1) can be given by (3.22). Moreover, the self-adjoint linear operator Ω defined in (3.16) is given by Ω(x; (y; z)) = (x; Θ^{1/2}(y; z)), where Θ = τσ(P + T + ((2 − τ)σ/3)AA∗) with T and P being defined in (4.10) and (4.19), respectively. In fact, we further have that

  Ω(x; (y; z)) = Ω(x; (y; ΠRange(G)(z))), ∀ (x, y, z) ∈ X × Y × Z.   (4.26)

Theorem 4.3. Suppose that Assumptions 4.1 and 4.2 hold. Let {uk = (xk, wk)} with wk := (yk; zk) be the sequence generated by Algorithm sGS-iPADMM such that {vk := (xk, yk, ΠRange(G)(zk))} converges to v∗ ∈ K. It holds that

  distΩ(uk, K) = distΩ(vk, K), ∀ k ≥ 0.   (4.27)

Suppose that b − Gx0 = 0 and γk = 0 for all k ≥ 0. Suppose that the KKT residual mapping R defined in (3.22) (with the notation in (4.25)) is metrically subregular at v∗ for 0 ∈ U with the modulus κ > 0, in the sense that there exists a constant r > 0 such that (3.26) holds with u = v∗. Let {ηk} be a given sequence of nonnegative numbers that converges to 0. Suppose that, in addition to satisfying max{‖δ̄ki‖, ‖δki‖ | i = 1, . . . , s} ≤ εk, there exists an integer k0 > 0 such that for any k ≥ k0, it holds that

  max_{1≤i≤s} {‖δ̄ki‖, ‖δki‖} ≤ ηk ‖vk − vk+1‖.   (4.28)

Then, for all k sufficiently large, it holds that distΩ(uk+1, K) ≤ ϑk distΩ(uk, K) with sup_{k≥k0} ϑk < 1, i.e., the convergence rate of {distΩ(uk, K)} is Q-linear when k is sufficiently large.

Proof. By (4.26), we have that for all k ≥ 0,

  dist²Ω(uk, K) = inf_{u∈K} (1/2)〈uk − u, Ω(uk − u)〉 = inf_{u∈K} (1/2)〈uk − u, Ω(vk) − Ω(u)〉
              = inf_{u∈K} (1/2)〈vk − u, Ω(vk − u)〉 = dist²Ω(vk, K),

i.e., (4.27) holds. Since b − Gx0 = 0 and γk = 0 for all k ≥ 0, according to (4.18) one has that

  ‖b − Gxk‖ ≤ |1 − τ|^k ‖b − Gx0‖ + τ ∑_{i=1}^{k} |1 − τ|^{k−i} ‖γi−1‖ = 0, ∀ k ≥ 0.

Therefore, since G(xk − xk−1) = 0 for all k ≥ 0, one knows by (4.7) and (4.11) that

  ∆k = δksGS − FG∗[GG∗]†G(xk − xk−1) = δksGS = δk + NuNd⁻¹(δk − δ̄k).

Thus, we can get that for all k ≥ 0,

  ‖dk‖ = ‖∆k‖ ≤ (1 + 2‖NuNd⁻¹‖) max{‖δk‖, ‖δ̄k‖} ≤ √s (1 + 2‖NuNd⁻¹‖) ηk ‖vk − vk+1‖,

where dk := (∆k; γk) ∈ W. Define η̂k := √s (1 + 2‖NuNd⁻¹‖) ηk. Then, it holds that η̂k → 0 and ‖dk‖ ≤ η̂k‖vk − vk+1‖. Therefore, by Theorem 3.2, we know that for all k sufficiently large,

  distΩ(vk+1, K) ≤ ϑk distΩ(vk, K)


with sup_{k≥k0} ϑk < 1, which, together with (4.27), implies

  distΩ(uk+1, K) ≤ ϑk distΩ(uk, K) for all k sufficiently large.

This completes the proof.

Remark 4.4. Note that, different from condition (3.30) in Assumption 3.2, condition (4.28) here is generally not directly verifiable during the numerical implementation. However, Theorem 4.3 does provide a very important theoretical guideline for implementing Algorithm sGS-iPADMM: in the k-th iteration, it is likely to be beneficial to solve the subproblems to an accuracy higher than the dual feasibility ‖F∗yk + G∗zk − c‖. In fact, this phenomenon has already been observed in our numerical experiments. We should also mention that, even for the two-block case, the study of the linear convergence of inexact ADMMs with the shorter step-length τ ∈ (0, (1 + √5)/2) is still not as mature as that of their exact counterparts, especially when compared with the recently developed results in, e.g., [22, 57]. Suitable criteria that generalize condition (3.30) for terminating the subproblems are still lacking. We note that the results presented in Theorem 4.3 are still far from complete, and more effort should be devoted to this part in the future.

Finally, different from the above discussions on the convergence rate, we can establish the following non-ergodic iteration complexity for the sequence generated by Algorithm sGS-iPADMM by a direct application of Theorem 4.1.

Theorem 4.4. Suppose that Assumptions 4.1 and 4.2 hold. Let {uk = (xk, wk)} with wk := (yk; zk) be the sequence generated by Algorithm sGS-iPADMM such that {vk := (xk, yk, ΠRange(G)(zk))} converges to v∗ ∈ K. It holds that the KKT residual (3.22), with B and P given by (4.19), satisfies

  min_{0≤j≤k} ‖R(uj)‖² ≤ ϱ/k and lim_{k→∞} ( k · min_{0≤j≤k} ‖R(uj)‖² ) = 0,

where the constant ϱ is defined as in (3.43) but with

  e := ‖u0e‖²Ω + 2τσ‖Θ^{−1/2}‖ (∑_{j=0}^{∞} ε̂j)(‖u0e‖Ω + µ ∑_{j=0}^{∞} ε̂j) + 4 (∑_{j=1}^{∞} ε̂j²) and u0e = u0 − v∗.

Proof. From (4.26), we know that

  ‖u0 − v∗‖²Ω = 〈Ω(u0 − v∗), u0 − v∗〉 = 〈Ω(v0) − Ω(v∗), u0 − v∗〉 = ‖v0 − v∗‖²Ω.   (4.29)

According to (3.22), (4.2) and (4.25), one has that

  R(u) = ( c − F∗y − G∗z ; y − Proxp(y − ∇f(y) − Fx) ; Gx − b ), ∀ u = (x, y, z) ∈ X × Y × Z.

Since G∗zk = G∗ΠRange(G)(zk) for all k ≥ 0, one has that R(uk) = R(vk). Therefore, by using (4.22) in Theorem 4.1(a), Theorem 3.3 and (4.29), the results of this theorem hold. This completes the proof.

5 Numerical Experiments

In this section, we conduct numerical experiments on solving dual linear SDP and dual convex quadratic SDP problems via Algorithm sGS-iPADMM, with the dual step-length τ taking values beyond the standard restriction of (1 + √5)/2. For linear SDP problems, the algorithm reduces to the two-block ADMM, and the


aim is two-fold. On the one hand, as the ADMM is among the most important first-order algorithms for solving SDP problems, it is important to know to what extent the numerical efficiency can be improved if the observation on the dual step-length made in this paper is incorporated. On the other hand, as the upper bound of the step-length has been enlarged, it is also important to see whether a step-length that is very close to the upper bound will lead to better or worse numerical performance.

A standard linear SDP problem has the following form:

  min_X { 〈C, X〉 | AX = b, X ∈ Sn+ }   (5.1)

and its corresponding dual is given as in (2.4). To avoid repetition, we refer the reader to (2.4) for the notation used. The (majorized) augmented Lagrangian function associated with problem (2.4) is given by

  Lσ(S, z; X) = δSn+(S) − 〈b, z〉 + 〈X, S + A∗z − C〉 + (σ/2)‖S + A∗z − C‖², ∀ (S, z, X) ∈ Sn × Rm × Sn,

where σ > 0 is the given penalty parameter. When applied to solving problem (2.4), the two-block ADMM performs the following steps at the k-th iteration:

Sk+1 = ΠSn+(C −A∗zk −Xk/σ),

zk+1 = (AA∗)−1(A(C − Sk+1)− (AXk − b)/σ

),

Xk+1 = Xk + τσ(Sk+1 + A∗zk+1 − C),

where the step-length τ is allowed to be in the range (0, 2) based on Theorem 4.1 and the discussions in Section 4.1. We emphasize again that this is in contrast to the usual interval of (0, (1 + √5)/2) allowed by the convergence analysis of Glowinski in [20, Theorem 5.1].

On the other hand, as was briefly introduced in Section 2.1, the convex QSDP problem was formally given in (2.1), whose dual problem, in minimization form, is a multi-block problem given by

  min_{S,W,zE,zI}  δSn+(S) + δR^{mI}_+(zI) + (1/2)〈W, QW〉 − 〈bE, zE〉 − 〈bI, zI〉
  s.t.  S − QW + A∗E zE + A∗I zI + C = 0.   (5.2)
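Before turning to QSDP, the two-block ADMM updates displayed above for the linear SDP can be made concrete numerically. The following NumPy script is only an illustrative sketch on a hypothetical toy instance (A X := tr(X), b = 1, so AA∗ = n and the optimal value is the smallest eigenvalue of C); the experiments reported in this paper were run in Matlab with SDPNAL+, not with this code:

```python
import numpy as np

def proj_psd(M):
    """Project a symmetric matrix onto the positive semidefinite cone."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.maximum(w, 0.0)) @ V.T

# Hypothetical toy instance:  min <C,X>  s.t.  tr(X) = 1, X >= 0,
# i.e. A X := tr(X), A* z := z I, AA* = n, optimal value = lambda_min(C).
rng = np.random.default_rng(0)
n, sigma, tau = 4, 1.0, 1.9            # step-length beyond (1 + sqrt(5))/2
C = rng.standard_normal((n, n)); C = (C + C.T) / 2
b = 1.0
X, z = np.zeros((n, n)), 0.0

for _ in range(2000):
    S = proj_psd(C - z * np.eye(n) - X / sigma)              # S-update
    z = (np.trace(C - S) - (np.trace(X) - b) / sigma) / n    # z-update via (AA*)^{-1}
    X = X + tau * sigma * (S + z * np.eye(n) - C)            # multiplier update

eta_P = abs(np.trace(X) - b)                                 # primal feasibility
gap = abs(np.trace(C @ X) - np.linalg.eigvalsh(C)[0])        # distance to optimal value
print(eta_P, gap)
```

On this instance the z-update makes tr(X) − b contract by the factor |1 − τ| = 0.9 per iteration, which illustrates why step-lengths approaching the renewed upper bound of 2 can still converge in this setting.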

Note that problem (2.1) was subsumed as an instance of the convex quadratic composite optimization problem (1.7). Therefore, to fit the framework of Algorithm sGS-iPADMM, we write the dual of problem (2.1) in the minimization form as

  min_{S,W,s,zE,zI}  δSn+(S) + δR^{mI}_+(s) + (1/2)〈W, QW〉 − 〈bE, zE〉 − 〈bI, zI〉
  s.t.  S − QW + A∗E zE + A∗I zI + C = 0,  D(s − zI) = 0,   (5.3)

where D ∈ R^{mI×mI} is a given positive definite diagonal matrix which is incorporated here for the purpose of scaling the variables to ensure numerical stability.

The convex QSDP problem (2.1) is solved via its dual (5.3), whose (majorized) augmented Lagrangian function is defined by

  Lσ(S, W, zE, zI, s; X, x) := δSn+(S) + δR^{mI}_+(s) + (1/2)〈W, QW〉 − 〈bE, zE〉 − 〈bI, zI〉
    + 〈X, S − QW + A∗E zE + A∗I zI + C〉 + 〈D(s − zI), x〉
    + (σ/2)‖S − QW + A∗E zE + A∗I zI + C‖² + (σ/2)‖D(s − zI)‖²,
  ∀ (S, W, zE, zI, s; X, x) ∈ Sn × Sn × R^{mE} × R^{mI} × R^{mI} × Sn × R^{mI},


where σ > 0 is the given penalty parameter, and we have used X ∈ Sn and x ∈ R^{mI} to denote the Lagrange multipliers introduced for the two groups of equality constraints in (5.3). During the k-th iteration of Algorithm sGS-iPADMM with given (Sk, Wk, zkE, zkI, sk) and (Xk, xk), we update the variables in the order

  ( zk+1/2E ⇒ Wk+1/2 [backward GS] ⇒ (Sk+1, sk+1) ⇒ Wk+1 ⇒ zk+1E [forward GS] ) ⇒ zk+1I ⇒ (Xk+1, xk+1) [step-length τ ∈ (0, 2)].

Note that the term 〈bI, zI〉 is treated as the linear term in the framework of (1.1). We made this choice because, for the test instances that we will consider later, the linear system that must be solved to update zI is much larger than that for updating zE; in this way, the larger linear system is solved only once in each iteration.

The numerical results in the subsequent two subsections are obtained by using Matlab R2017b on an HP EliteDesk (64-bit Windows 10 system) with one Intel Core i7-4770S processor (4 cores, 3.1–3.9 GHz) and 16 GB RAM (with the virtual memory turned off).

5.1 Numerical Results on Linear SDP Problems

Based on the first-order optimality condition for problem (5.1), we terminate all the tested algorithms if

  ηSDP := max{ηD, ηP, ηS} ≤ 10⁻⁶,

where

  ηD = ‖A∗z + S − C‖ / (1 + ‖C‖),  ηP = ‖AX − b‖ / (1 + ‖b‖),
  ηS = max{ ‖X − ΠSn+(X)‖ / (1 + ‖X‖), |〈X, S〉| / (1 + ‖X‖ + ‖S‖) },

with the maximum number of iterations set at 100,000. In addition, we also measure the duality gap:

  ηgap := (〈C, X〉 − 〈b, z〉) / (1 + |〈C, X〉| + |〈b, z〉|).

During our preliminary tests, we found that using a step-length smaller than 1 is not as good as using the unit step-length. Therefore, we shall only consider the cases with τ ≥ 1. Note that the known theoretical upper bound on the step-length τ in the classic ADMM for solving general convex programming problems is (1 + √5)/2 ≈ 1.618034. Although it has been observed empirically that the ADMM with the step-length τ = 1.618 works quite well, this phenomenon still requires further understanding, since the value 1.618 is quite close to the theoretical upper bound and such an aggressive choice may result in unstable numerical performance. Fortunately, the above concern is partially alleviated by the theoretical results obtained in this paper. Indeed, for a large class of convex optimization problems, one can use τ = 1.618 confidently since it has a “safe” distance to the renewed theoretical upper bound of 2. For this class of problems, it is thus very interesting to see what would happen if the step-length τ is very close to 2. Therefore, we tested five choices of the step-length, i.e., τ = 1, 1.618, 1.90, 1.99 and 1.999. For convenience, we use ADMM(τ) to denote the algorithm with the specific step-length τ.

We tested 6 categories of linear SDP problems, including the random sparse SDP problems tested in [34], the semidefinite relaxation of frequency assignment problems (FAP) [15], the relaxation of maximum stable set problems [50, 52, 47], the SDP relaxation of binary integer quadratic (BIQ) problems from [54], the SDP relaxation of rank-1 tensor approximations (R1TA) [38, 39], and the SDP relaxations of clustering problems [40]. One may refer to [58, 56] for detailed descriptions and the data sources of these problems. All these algorithms are tested by running the Matlab package SDPNAL+ (version 1.0, available at http://www.math.nus.edu.sg/~mattohkc/SDPNALplus.html). The records of the computational results


[Figure 1 here: two performance-profile panels, each titled “Efficiency: ratio of iteration numbers”, plotting the iteration-number ratio (vertical axis, 0.4–2) against the fraction (100y)% of the problems (horizontal axis). Left panel: ADMM(1.618), ADMM(1.90), ADMM(1.99) and ADMM(1.999) compared to ADMM(1). Right panel: ADMM(1), ADMM(1.90), ADMM(1.99) and ADMM(1.999) compared to ADMM(1.618).]

Figure 1: Comparison of the computational efficiency of the classic two-block ADMM with different step-lengths

are provided in Table 3. Here, we should mention that even though all the problems we tested have been successfully solved by at least one of the tested algorithms, there are a few categories of SDP problems that are beyond the capability of the ADMM; see, e.g., [58].

Figure 1 presents the computational performance of the ADMM with all five choices of step-lengths. The left panel shows the comparison between ADMM(1) and all the other algorithms, while the right panel shows the comparison between ADMM(1.618) and all the others. As can be seen from Figure 1, ADMM(1.618) has an impressive improvement over ADMM(1), and ADMM(1.9) works even better than ADMM(1.618) for more than 80% of the tested instances. Furthermore, ADMM(1.99) performs marginally better than ADMM(1.9) for about 60% of the tested instances, but for about 10% of them its performance is noticeably worse. However, ADMM(1.999) has a significantly worse performance than ADMM(1.99), even though its step-length is just slightly larger than 1.99. This can be partially explained by the fact that the step-length of 1.999 is too close to the theoretical upper bound of 2.

From both the theoretical analysis and the numerical experiments in this paper, one can see that it is in general a good idea to use a step-length larger than 1, e.g., τ = 1.618, when solving linear SDP problems. Meanwhile, one can even set the step-length to be larger than 1.618, say τ = 1.9, to obtain even better numerical performance.



5.2 Numerical Results on Convex QSDP Problems

The KKT system of problem (5.2) is given by
\[
\left\{
\begin{aligned}
& S - \mathcal{Q}W + \mathcal{A}_E^* z_E + \mathcal{A}_I^* z_I + C = 0, \qquad \mathcal{A}_E X - b_E = 0,\\
& \mathcal{Q}X - \mathcal{Q}W = 0, \qquad X \in \mathcal{S}^n_+, \quad S \in \mathcal{S}^n_+, \quad \langle X, S\rangle = 0,\\
& \mathcal{A}_I X - b_I \ge 0, \qquad z_I \ge 0, \qquad \langle \mathcal{A}_I X - b_I, z_I\rangle = 0.
\end{aligned}
\right.
\tag{5.4}
\]

Based on the optimality conditions given in (5.4), we measure the accuracy of an approximate (computed) solution (X, Z, W, S, z_E, z_I) for the convex QSDP (2.1) and its dual (5.2) via

\[
\eta_{\rm qsdp} = \max\{\eta_D,\ \eta_P,\ \eta_W,\ \eta_S,\ \eta_I\},
\tag{5.5}
\]

where

\[
\begin{aligned}
\eta_D &= \frac{\|S - \mathcal{Q}W + \mathcal{A}_E^* z_E + \mathcal{A}_I^* z_I + C\|}{1 + \|C\|}, \qquad
\eta_P = \frac{\|\mathcal{A}_E X - b_E\|}{1 + \|b_E\|}, \\[4pt]
\eta_W &= \frac{\|\mathcal{Q}X - \mathcal{Q}W\|}{1 + \|\mathcal{Q}\|}, \qquad
\eta_S = \max\left\{ \frac{\|X - \Pi_{\mathcal{S}^n_+}(X)\|}{1 + \|X\|},\ \frac{|\langle X, S\rangle|}{1 + \|X\| + \|S\|} \right\}, \\[4pt]
\eta_I &= \max\left\{ \frac{\|\min(0, z_I)\|}{1 + \|z_I\|},\ \frac{\|\min(0, \mathcal{A}_I X - b_I)\|}{1 + \|b_I\|},\ \frac{|\langle \mathcal{A}_I X - b_I, z_I\rangle|}{1 + \|\mathcal{A}_I X - b_I\| + \|z_I\|} \right\}.
\end{aligned}
\]

Additionally, we measure the objective values and the duality gap:

\[
\eta_{\rm gap} := \frac{{\rm Obj}_{\rm primal} - {\rm Obj}_{\rm dual}}{1 + |{\rm Obj}_{\rm primal}| + |{\rm Obj}_{\rm dual}|},
\]

where

\[
{\rm Obj}_{\rm primal} := \tfrac{1}{2}\langle X, \mathcal{Q}X\rangle - \langle C, X\rangle,
\quad\text{and}\quad
{\rm Obj}_{\rm dual} := -\tfrac{1}{2}\langle W, \mathcal{Q}W\rangle + \langle b_E, z_E\rangle + \langle b_I, z_I\rangle.
\]
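To make the stopping criterion concrete, the following is a minimal NumPy sketch (not taken from the paper's MATLAB implementation) of how the relative KKT residuals above can be evaluated for a small dense instance. Here `proj_psd`, `Qop`, `normQ` and the vectorized constraint matrices `AE`, `AI` are illustrative stand-ins for the abstract operators in the paper.

```python
import numpy as np

def proj_psd(M):
    """Projection of a symmetric matrix onto the PSD cone S^n_+."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.maximum(w, 0.0)) @ V.T

def eta_qsdp(X, W, S, zE, zI, C, bE, bI, AE, AI, Qop, normQ):
    """Relative KKT residuals in the spirit of (5.5) for a toy instance.

    AE (mE x n^2) and AI (mI x n^2) hold vectorized constraint matrices,
    Qop applies the self-adjoint operator Q, and normQ stands in for the
    operator norm of Q; all of these are assumptions for illustration.
    """
    n = X.shape[0]
    x, fro = X.reshape(-1), np.linalg.norm
    # dual feasibility residual: S - Q W + AE* zE + AI* zI + C
    R = S - Qop(W) + (AE.T @ zE).reshape(n, n) + (AI.T @ zI).reshape(n, n) + C
    etaD = fro(R) / (1 + fro(C))
    etaP = fro(AE @ x - bE) / (1 + fro(bE))
    etaW = fro(Qop(X) - Qop(W)) / (1 + normQ)
    etaS = max(fro(X - proj_psd(X)) / (1 + fro(X)),
               abs(np.sum(X * S)) / (1 + fro(X) + fro(S)))
    rI = AI @ x - bI           # inequality residual AI X - bI
    etaI = max(fro(np.minimum(0.0, zI)) / (1 + fro(zI)),
               fro(np.minimum(0.0, rI)) / (1 + fro(bI)),
               abs(rI @ zI) / (1 + fro(rI) + fro(zI)))
    return max(etaD, etaP, etaW, etaS, etaI)
```

At an exact KKT point every residual vanishes, so the returned value is zero; any primal or dual infeasibility makes it strictly positive.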

In our numerical experiments, similar to [6], we used QSDP test instances based on the SDP problems arising from the relaxation of binary integer quadratic (BIQ) programming with a large number of inequality constraints, which was introduced by Sun et al. [48] for getting tighter bounds. The problems that we actually solve have the following form:

\[
\begin{array}{rl}
\min & \tfrac{1}{2}\langle X, \mathcal{Q}X\rangle + \tfrac{1}{2}\langle Q, X\rangle + \langle c, x\rangle \\[4pt]
{\rm s.t.} & {\rm diag}(X) - x = 0, \qquad \begin{pmatrix} X & x \\ x^{T} & 1 \end{pmatrix} \in \mathcal{S}^n_+, \\[4pt]
& x_i - X_{ij} \ge 0, \quad x_j - X_{ij} \ge 0, \quad X_{ij} - x_i - x_j \ge -1, \qquad \forall\, 1 \le i < j \le n-1.
\end{array}
\]
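To make the size of the inequality block concrete, the following small sketch (an illustration, not the paper's instance generator) enumerates the valid inequalities above; it shows that the number of inequality constraints grows like 3(n−1)(n−2)/2, which is why these instances have "a large number of inequality constraints".

```python
def biq_inequalities(n):
    """Enumerate the valid inequalities of the BIQ relaxation above.

    Each tag (kind, i, j) stands for one inequality on (x, X):
      'xi'  : x_i - X_ij >= 0
      'xj'  : x_j - X_ij >= 0
      'sum' : X_ij - x_i - x_j >= -1
    with indices following the convention 1 <= i < j <= n-1.
    """
    cons = []
    for i in range(1, n - 1):
        for j in range(i + 1, n):
            cons.extend([('xi', i, j), ('xj', i, j), ('sum', i, j)])
    return cons
```

For example, n = 4 already yields 9 inequalities, and the count grows quadratically with n.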

The data for Q and c are taken from the Biq Mac Library maintained by Wiegele [54]. We solve the QSDP (2.1) via its dual (5.3) with the matrix D = (√‖A_I‖/2) I on R^{m_I}. We use the directly extended multi-block ADMM with step-length τ = 1 as the benchmark (which we name 'Directly Extended'), and compare it with Algorithm sGS-iPADMM, which was implemented in 6 different ways, i.e., 2 groups of algorithms, each using 3 different step-lengths, namely τ = 1, 1.618 and 1.9, chosen according to the numerical results in Section 5.1. For convenience, we use the name 'sGS-PADMM' to mean that Algorithm sGS-iPADMM is implemented such that all the subproblems are solved exactly via direct solvers or by adding appropriate proximal terms, and 'sGS-iPADMM' to mean that Algorithm sGS-iPADMM is implemented such that the subproblems are allowed to be solved inexactly via iterative solvers. The details of all seven tested algorithms are presented in Table 2.



Table 2: 7 types of algorithms tested for the convex QSDP problems. In the table, 'Dir.' denotes that the corresponding subproblems are solved via direct solvers, 'Proj.' means that the corresponding subproblems are the calculation of projections, 'Prox.' means that the corresponding subproblems are solved via adding appropriate proximal terms to ensure closed-form solutions, 'Inex.' means that the subproblems are solved approximately via an iterative scheme, '(rep.)' means that the corresponding subproblems in the forward sGS sweeps are also solved, and '(check)' means that the corresponding subproblems in the forward sGS sweeps are not directly solved but the most recently updated variables are checked to see if they are admissible approximate solutions to the subproblems.

Algorithm \ Variable | τ | zE | W | S | s | zI
Directly Extended | 1 | Dir. | Prox. | Proj. | Proj. | Prox.
sGS-PADMM | 1 | Dir.(rep.) | Prox.(rep.) | Proj. | Proj. | Prox.
sGS-PADMM | 1.618 | Dir.(rep.) | Prox.(rep.) | Proj. | Proj. | Prox.
sGS-PADMM | 1.9 | Dir.(rep.) | Prox.(rep.) | Proj. | Proj. | Prox.
sGS-iPADMM | 1 | Dir.(check) | Inex.(check) | Proj. | Proj. | Inex.
sGS-iPADMM | 1.618 | Dir.(check) | Inex.(check) | Proj. | Proj. | Inex.
sGS-iPADMM | 1.9 | Dir.(check) | Inex.(check) | Proj. | Proj. | Inex.

For all the algorithms applied to problem (5.3), the subproblems corresponding to the block variable (S, s) can be solved analytically by computing the projections onto S^n_+ × R^{m_I}_+. For the subproblems corresponding to zE, linear systems of equations must be solved with the same coefficient matrix A_E A_E^*. As these linear systems are not too large, we solve them via the Cholesky factorization (computed only once) of A_E A_E^*. For the subproblems corresponding to zI and W, very large scale linear systems of equations need to be solved, so they are either solved via a preconditioned conjugate gradient method with the preconditioners described in [6, Section 7.1] (for sGS-iPADMM), or solved directly by adding an appropriate proximal term to the subproblems to get closed-form solutions (for sGS-PADMM). Moreover, in the implementation of sGS-PADMM, all the subproblems in the forward Gauss-Seidel sweep are directly solved, while in the implementation of sGS-iPADMM we used the strategy described in [6, Remark 4.1(b)] to decide whether the computation of the subproblems in the forward GS sweep can be skipped (see [6, Section 7.2] for more details on using this technique). We used a strategy similar to that described in [27, Section 4.4] to adaptively adjust the penalty parameter σ, and used the same technique as in [6] to control the error tolerance for solving the subproblems, i.e., {εk}k≥0 is chosen such that αεk ≤ 1/k^1.2, where α > 0 is a positive constant based on the problem data.

We have tested 147 instances of convex QSDP problems with n ranging from 51 to 501. The linear operator Q was chosen as the symmetrized Kronecker operator Q(X) = ½(AXB + BXA), with A and B being two randomly generated symmetric positive semidefinite matrices such that rank(A) = 10 and rank(B) ≈ n/5, respectively, as was used in [51, 6]. The maximum iteration number is set at 500,000. The detailed computational results are provided in Table 4.
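The operator Q(X) = ½(AXB + BXA) and the low-rank generation of A and B can be sketched as follows (an illustrative stand-in for the actual instance generator, with arbitrary dimensions and seed). For symmetric A and B, this Q is self-adjoint with respect to the trace inner product and maps symmetric matrices to symmetric matrices, which the checks below confirm.

```python
import numpy as np

def random_psd(n, rank, rng):
    """Random symmetric PSD matrix of the given rank (mimics how the
    factors A and B are generated for the test instances)."""
    G = rng.standard_normal((n, rank))
    return G @ G.T

def sym_kron_operator(A, B):
    """Return the map X |-> (A X B + B X A) / 2 used to define Q."""
    return lambda X: (A @ X @ B + B @ X @ A) / 2
```

Self-adjointness matters here because the QSDP model requires Q to be a self-adjoint positive semidefinite operator on the matrix space.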



[Figure 2 is a plot titled "Performance profile: time", showing (100y)% of the problems (y-axis, from 0 to 1) against "at most x times of the best" (x-axis, from 1 to 5), with curves for Directly Extended, sGS-PADMM-1, sGS-PADMM-1.618, sGS-PADMM-1.9, sGS-iPADMM-1, sGS-iPADMM-1.618 and sGS-iPADMM-1.9.]

Figure 2: Comparison of the computational efficiency of the 7 algorithms

Figure 2 shows the numerical performance of the 7 tested algorithms described in Table 2 on solving the convex QSDP problems to the accuracy of 10−6 in terms of ηqsdp in (5.5). One can readily see from the figure that sGS-iPADMM overwhelmingly outperforms sGS-PADMM, no matter which step-length τ was used. This evidently shows the considerable advantage of allowing approximate solutions in the subproblems of Algorithm sGS-iPADMM. Moreover, for both sGS-PADMM and sGS-iPADMM, the step-length τ = 1.618 brings a noticeable improvement in numerical efficiency compared with using the unit step-length. Meanwhile, the choice of τ = 1.9 performs even better in general; this is even more apparent for sGS-PADMM, in which all the subproblems are solved exactly. We can see that sGS-iPADMM with τ = 1.9 performs the best among all the tested algorithms for almost 65% of the tested problems. Hence, the numerical results clearly demonstrate the merit of using a larger step-length and the flexibility of inexactly solving the subproblems.
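Curves of the kind shown in Figures 1 and 2 are performance profiles; as a generic illustration (not the authors' plotting code), the following sketch computes the height of each solver's profile curve from a table of positive costs such as CPU times or iteration counts.

```python
import numpy as np

def performance_profile(T, xs):
    """Performance-profile heights for a cost table.

    T is an (n_problems, n_solvers) array of positive costs; the
    returned P[k, s] is the fraction of problems that solver s finishes
    within xs[k] times the best cost over all solvers on that problem,
    i.e. the height of solver s's profile curve at x = xs[k].
    """
    ratios = T / T.min(axis=1, keepdims=True)   # per-problem cost ratios
    return np.array([[np.mean(ratios[:, s] <= x) for s in range(T.shape[1])]
                     for x in xs])
```

A curve that is higher on the left corresponds to a solver that is the fastest (or within a small factor of the fastest) on a larger fraction of the test set.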

6 Conclusions

In this paper, we have shown that, for a class of convex composite programming problems, the sequence generated by an inexact sGS decomposition based multi-block majorized (proximal) ADMM is equivalent to



the sequence generated by an inexact proximal ALM starting from the same initial point. The convergence of the inexact majorized proximal ALM was first established, and the convergence of the multi-block ADMM-type algorithm then follows readily from the newly discovered equivalence. As a consequence of this equivalence, we are able to provide a very general answer to the open question of whether the whole sequence generated by the classic ADMM with τ ∈ (0, 2), for a conventional two-block problem with one part of its objective function being linear, is convergent. Numerical experiments on solving a large number of linear and convex quadratic SDP problems are conducted. The numerical results show that one can achieve even better numerical performance of the ADMM if the step-length is chosen to be larger than the conventional upper bound of (1 + √5)/2, and that one can get a considerable improvement by allowing inexact subproblems together with large step-lengths in the multi-block ADMM for convex quadratic SDP problems. We hope that our theoretical analysis and numerical results can inspire more insightful studies on ADMM-type algorithms.

Acknowledgments

We would like to thank the two anonymous referees for their careful reading of this paper, and their insightfulcomments and suggestions which have helped to improve the quality of this paper.

References

[1] Bai, M., Zhang, X., Ni, G. and Cui, C.: An adaptive correction approach for tensor completion. SIAM J. Imaging Sci. 9, 1298–1323 (2016)

[2] Bai, S. and Qi, H.-D.: Tackling the flip ambiguity in wireless sensor network localization and beyond. Digit. Signal Process. 55, 85–97 (2016)

[3] Bertsekas, D.P. and Tsitsiklis, J.N.: Parallel and Distributed Computation: Numerical Methods. Athena Scientific, Belmont, Massachusetts (1997)

[4] Boyd, S., Parikh, N., Chu, E., Peleato, B. and Eckstein, J.: Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 3, 1–122 (2011)

[5] Chen, L., Sun, D.F. and Toh, K.-C.: A note on the convergence of ADMM for linearly constrained convex optimization problems. Comput. Optim. Appl. 66, 327–343 (2017)

[6] Chen, L., Sun, D.F. and Toh, K.-C.: An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming. Math. Program. 161(1-2), 237–270 (2017)

[7] Chen, S.S., Donoho, D.L. and Saunders, M.A.: Atomic decomposition by basis pursuit. SIAM Rev. 43, 129–159 (2001)

[8] Clarke, F.H.: Optimization and Nonsmooth Analysis. Wiley, New York (1983)

[9] Ding, C. and Qi, H.-D.: Convex optimization learning of faithful Euclidean distance representations in nonlinear dimensionality reduction. Math. Program. 164(1-2), 341–381 (2017)

[10] Ding, C., Sun, D.F., Sun, J. and Toh, K.-C.: Spectral operators of matrices. Math. Program. 168, 509–531 (2018)

[11] Dontchev, A.L. and Rockafellar, R.T.: Implicit Functions and Solution Mappings, Second Edition. Springer, New York (2014)

[12] Du, M.Y.: A Two-Phase Augmented Lagrangian Method for Convex Composite Quadratic Programming. PhD thesis, Department of Mathematics, National University of Singapore (2015)

[13] Eckstein, J.: Augmented Lagrangian and alternating direction methods for convex optimization: A tutorial and some illustrative computational results. RUTCOR Research Reports (2012)

[14] Eckstein, J. and Yao, W.: Understanding the convergence of the alternating direction method of multipliers: Theoretical and computational perspectives. Pac. J. Optim. 11, 619–644 (2015)

[15] Eisenblätter, A., Grötschel, M. and Koster, A.: Frequency planning and ramification of coloring. Discuss. Math. Graph Theory 22, 51–88 (2002)

[16] Fazel, M., Pong, T.K., Sun, D.F. and Tseng, P.: Hankel matrix rank minimization with applications to system identification and realization. SIAM J. Matrix Anal. 34(3), 946–977 (2013)

[17] Ferreira, J., Khoo, Y. and Singer, A.: Semidefinite programming approach for the quadratic assignment problem with a sparse graph. Comput. Optim. Appl. 69(3), 677–712 (2018)

[18] Gabay, D. and Mercier, B.: A dual algorithm for the solution of nonlinear variational problems via finite element approximation. Comput. Math. Appl. 2(1), 17–40 (1976)

[19] Gaines, B.R., Kim, J. and Zhou, H.: Algorithms for fitting the constrained lasso. J. Comput. Graph. Stat. 27(4), 861–871 (2018)

[20] Glowinski, R.: Lectures on Numerical Methods for Non-Linear Variational Problems. Published for the Tata Institute of Fundamental Research, Bombay. Springer-Verlag (1980)

[21] Glowinski, R. and Marroco, A.: Sur l'approximation, par éléments finis d'ordre un, et la résolution, par pénalisation-dualité d'une classe de problèmes de Dirichlet non linéaires. Revue française d'automatique, informatique, recherche opérationnelle. Analyse numérique 9(2), 41–76 (1975)

[22] Han, D.R., Sun, D.F. and Zhang, L.W.: Linear rate convergence of the alternating direction method of multipliers for convex composite programming. Math. Oper. Res. 43(2), 622–637 (2018)

[23] Hestenes, M.: Multiplier and gradient methods. J. Optim. Theory Appl. 4(5), 303–320 (1969)

[24] Huber, P.J.: Robust estimation of a location parameter. Ann. Math. Stat. 35, 73–101 (1964)

[25] James, G.M., Paulson, C. and Rusmevichientong, P.: Penalized and constrained regression. Unpublished manuscript, the latest version is available at http://www-bcf.usc.edu/~gareth/research/Research.html (2013)

[26] Klopp, O.: Noisy low-rank matrix completion with general sampling distribution. Bernoulli 20(1), 282–303 (2014)

[27] Lam, X.Y., Marron, J.S., Sun, D.F. and Toh, K.-C.: Fast algorithms for large scale generalized distance weighted discrimination. J. Comput. Graph. Stat. 27(2), 368–379 (2018)

[28] Lemaréchal, C. and Sagastizábal, C.: Practical aspects of the Moreau-Yosida regularization: theoretical preliminaries. SIAM J. Optim. 7(2), 367–385 (1997)

[29] Li, M., Sun, D.F. and Toh, K.-C.: A majorized ADMM with indefinite proximal terms for linearly constrained convex composite optimization. SIAM J. Optim. 26(2), 922–950 (2016)

[30] Li, X.D., Sun, D.F. and Toh, K.-C.: A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions. Math. Program. 155, 333–373 (2016)

[31] Li, X.D., Sun, D.F. and Toh, K.-C.: QSDPNAL: A two-phase augmented Lagrangian method for convex quadratic semidefinite programming. Math. Program. Comput. 10(4), 703–743 (2018)

[32] Li, X.D., Sun, D.F. and Toh, K.-C.: A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications. Math. Program. DOI:10.1007/s10107-018-1247-7 (2018)

[33] Liu, J., Musialski, P., Wonka, P. and Ye, J.: Tensor completion for estimating missing values in visual data. IEEE Trans. Pattern Anal. Mach. Intell. 35, 208–220 (2013)

[34] Malick, J., Povh, J., Rendl, F. and Wiegele, A.: Regularization methods for semidefinite programming. SIAM J. Optim. 20, 336–356 (2009)

[35] Mateos, G., Bazerque, J.-A. and Giannakis, G.B.: Distributed sparse linear regression. IEEE Trans. Signal Process. 58, 5262–5276 (2010)

[36] Miao, W.M., Pan, S.H. and Sun, D.F.: A rank-corrected procedure for matrix completion with fixed basis coefficients. Math. Program. 159, 289–338 (2016)

[37] Negahban, S. and Wainwright, M.J.: Restricted strong convexity and weighted matrix completion: optimal bounds with noise. J. Mach. Learn. Res. 13, 1665–1697 (2012)

[38] Nie, J. and Wang, L.: Regularization methods for SDP relaxations in large-scale polynomial optimization. SIAM J. Optim. 22, 408–428 (2012)

[39] Nie, J. and Wang, L.: Semidefinite relaxations for best rank-1 tensor approximations. SIAM J. Matrix Anal. Appl. 35, 1155–1179 (2014)

[40] Peng, J. and Wei, Y.: Approximating k-means-type clustering via semidefinite programming. SIAM J. Optim. 18, 186–205 (2007)

[41] Potra, F.A.: Weighted complementarity problems–a new paradigm for computing equilibria. SIAM J. Optim. 22, 1634–1654 (2012)

[42] Powell, M.: A method for nonlinear constraints in minimization problems. In: Fletcher, R. (ed.) Optimization, pp. 283–298. Academic, New York (1969)

[43] Povh, J., Rendl, F. and Wiegele, A.: A boundary point method to solve semidefinite programs. Computing 78, 277–286 (2006)

[44] Rockafellar, R.T.: Convex Analysis. Princeton University Press (1970)

[45] Rockafellar, R.T.: Augmented Lagrangians and applications of the proximal point algorithm in convex programming. Math. Oper. Res. 1, 97–116 (1976)

[46] Schizas, I.D., Ribeiro, A. and Giannakis, G.B.: Consensus in ad hoc WSNs with noisy links - Part I: distributed estimation of deterministic signals. IEEE Trans. Signal Process. 56, 350–364 (2008)

[47] Sloane, N.: Challenge Problems: Independent Sets in Graphs, http://www.research.att.com/~njas/doc/graphs.html

[48] Sun, D.F., Toh, K.-C. and Yang, L.Q.: A convergent 3-block semi-proximal alternating direction method of multipliers for conic programming with 4-type constraints. SIAM J. Optim. 25(2), 882–915 (2015)

[49] Teo, C.H., Vishwanathan, S.V.N., Smola, A. and Le, Q.V.: Bundle methods for regularized risk minimization. J. Mach. Learn. Res. 11, 313–365 (2010)

[50] Toh, K.-C.: Solving large scale semidefinite programs via an iterative solver on the augmented systems. SIAM J. Optim. 14, 670–698 (2004)

[51] Toh, K.-C.: An inexact primal-dual path-following algorithm for convex quadratic SDP. Math. Program. 112(1), 221–254 (2008)

[52] Trick, M., Chvatal, V., Cook, W., Johnson, D., McGeoch, C. and Tarjan, R.: The Second DIMACS Implementation Challenge: NP Hard Problems: Maximum Clique, Graph Coloring, and Satisfiability. Rutgers University, http://dimacs.rutgers.edu/Challenges/ (1992)

[53] Wang, B. and Zou, H.: Another look at distance-weighted discrimination. J. R. Stat. Soc. B 80, 177–198 (2018)

[54] Wiegele, A.: Biq Mac library–a collection of Max-Cut and quadratic 0–1 programming instances of medium size. Technical report (2007) http://biqmac.uni-klu.ac.at/biqmaclib.pdf

[55] Yan, Z., Gao, S.Y. and Teo, C.P.: On the design of sparse but efficient structures in operations. Manage. Sci. 64, 2973–3468 (2018)

[56] Yang, L.Q., Sun, D.F. and Toh, K.-C.: SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints. Math. Program. Comput. 7, 331–366 (2015)

[57] Zhang, N., Wu, J. and Zhang, L.W.: A linearly convergent majorized ADMM with indefinite proximal terms for convex composite programming and its applications. arXiv:1706.01698v2 (2018)

[58] Zhao, X.Y., Sun, D.F. and Toh, K.-C.: A Newton-CG augmented Lagrangian method for semidefinite programming. SIAM J. Optim. 20, 1737–1765 (2010)

[59] Zhu, H., Cano, A. and Giannakis, G.B.: Distributed consensus-based demodulation: algorithms and error analysis. IEEE Trans. Wirel. Commun. 9, 2044–2054 (2010)



Table 3: The performance of ADMM with τ = 1, 1.618, 1.90, 1.99, 1.999 on solving SDP problems (accuracy = 10−6). The maximum iteration number is set as 100,000. In the table, the time of computation is in the format "hours:minutes:seconds" if it is larger than one minute, or else it is recorded in seconds. In each data column, the five values separated by "|" correspond to τ = 1 | 1.618 | 1.90 | 1.99 | 1.999.

problem (m, n) | iteration | ηSDP | ηgap | CPU time
theta4 (1949, 200) | 408|344|314|601|3401 | 9.8-7|9.4-7|9.9-7|8.9-7|8.6-7 | 9.7-8|-5.4-7|1.7-6|-7.3-10|-5.5-9 | 2.6|1.7|1.6|2.8|15.1
theta42 (5986, 200) | 205|169|173|166|167 | 9.8-7|9.5-7|9.5-7|9.9-7|9.6-7 | -4.0-8|-4.5-8|4.3-8|3.9-8|3.5-8 | 1.1|0.9|0.9|0.9|0.9
theta6 (4375, 300) | 400|353|333|672|3735 | 9.0-7|9.0-7|9.4-7|9.9-7|9.9-7 | -1.7-6|2.0-6|2.1-6|-2.0-10|1.5-9 | 4|3.6|3.3|6.7|35.9
theta62 (13390, 300) | 186|185|173|163|201 | 9.5-7|9.2-7|9.6-7|9.6-7|9.7-7 | 5.4-8|1.4-7|1.4-7|5.8-9|-6.4-10 | 2|2|2.1|1.8|2.1
theta8 (7905, 400) | 453|380|357|644|3401 | 9.4-7|8.1-7|9.7-7|9.9-7|8.8-7 | -6.5-7|1.7-6|2.0-6|-2.1-10|1.7-9 | 9.4|7.9|7.4|13.3|1:10
theta82 (23872, 400) | 191|171|172|181|182 | 9.1-7|9.7-7|9.2-7|9.5-7|9.8-7 | -1.5-9|1.8-7|6.8-8|7.1-8|7.2-8 | 4.3|3.8|3.8|4|4
theta83 (39862, 400) | 173|161|150|147|147 | 9.9-7|9.5-7|9.8-7|9.7-7|9.4-7 | -6.2-8|1.1-7|1.1-7|1.1-7|1.1-7 | 3.9|3.7|3.4|3.4|3.3
theta10 (12470, 500) | 464|386|389|686|2801 | 8.5-7|8.9-7|8.2-7|9.9-7|9.4-7 | -1.4-6|2.0-6|1.7-6|-1.2-10|2.9-9 | 14.6|12.2|12.3|21.5|1:27
theta102 (37467, 500) | 197|185|171|172|251 | 9.5-7|9.8-7|9.3-7|9.9-7|8.5-7 | 6.9-8|1.1-7|4.6-8|5.8-8|-2.2-10 | 6.5|6.1|5.7|5.7|8.2
theta103 (62516, 500) | 154|144|141|145|144 | 9.8-7|9.0-7|9.3-7|9.3-7|9.7-7 | 1.3-7|6.9-8|4.6-8|3.5-8|3.6-8 | 5.2|4.9|4.8|4.9|4.9
theta104 (87245, 500) | 159|148|155|152|151 | 9.8-7|9.5-7|9.7-7|9.2-7|9.7-7 | 1.8-7|1.1-7|4.9-8|4.3-8|4.4-8 | 5.5|5.2|5.4|5.2|5.2
theta12 (17979, 600) | 458|392|382|601|3101 | 9.5-7|8.2-7|9.6-7|9.3-7|8.4-7 | 1.6-6|1.8-6|-1.8-6|4.6-10|-4.9-10 | 24.3|20.7|20.2|31.5|2:41
theta123 (90020, 600) | 150|144|136|142|142 | 9.5-7|9.6-7|9.9-7|9.4-7|9.2-7 | 6.3-8|6.6-8|7.5-8|5.8-8|5.6-8 | 8.8|8.4|8|8.4|8.2
theta162 (127600, 800) | 181|148|138|138|151 | 9.4-7|9.6-7|9.2-7|9.9-7|9.4-7 | 6.3-8|3.8-8|-1.3-8|2.8-8|1.1-8 | 20.6|16.8|15.9|15.5|17.2
MANN-a27 (703, 378) | 1516|1388|1372|1272|2112 | 9.1-7|9.1-7|7.5-7|8.2-7|9.9-7 | 1.3-5|-1.4-5|1.1-5|-1.3-5|8.0-9 | 22.3|20.3|19.8|18.5|34.2
johnson8-4-4 (561, 70) | 165|161|153|162|162 | 4.7-7|8.8-7|9.6-7|9.3-7|6.2-7 | 2.3-6|-2.4-6|-5.5-6|-3.4-6|-3.4-6 | 0.3|0.1|0.1|0.2|0.2
johnson16-2-4 (1681, 120) | 102|103|104|107|107 | 2.1-7|5.0-7|1.8-7|1.1-7|3.5-7 | -2.7-6|-5.0-6|2.4-6|-1.4-6|-1.2-6 | 0.2|0.2|0.1|0.1|0.2
san200-0.7-1 (5971, 200) | 3778|3898|3397|3446|5301 | 9.7-7|9.7-7|9.9-7|9.9-7|9.7-7 | -1.3-5|-7.6-6|-1.3-5|-1.1-5|-1.5-7 | 10.5|11.6|8.9|8.7|13.2
sanr200-0.7 (6033, 200) | 211|178|180|169|169 | 9.9-7|9.7-7|9.4-7|9.6-7|9.6-7 | -9.8-8|-1.8-7|-1.6-7|-1.6-7|-1.5-7 | 1|0.8|0.9|0.8|0.8
c-fat200-1 (18367, 200) | 239|370|368|568|2701 | 9.9-7|7.7-7|8.9-7|9.9-7|9.8-7 | 3.8-6|-6.4-6|-7.3-6|2.2-7|-1.1-6 | 1.1|1.7|1.6|2.5|11.3
hamming-6-4 (1313, 64) | 73|68|64|64|64 | 6.3-7|9.0-7|7.0-7|7.0-7|7.0-7 | 1.7-6|-2.5-6|3.7-6|3.3-6|3.3-6 | 0.1|0.1|0.1|0.1|0.1
hamming-8-4 (11777, 256) | 149|144|143|146|146 | 9.2-7|5.5-7|6.8-7|2.2-7|8.1-7 | -4.0-6|-5.8-6|3.1-6|-2.9-6|-2.0-6 | 0.9|0.8|0.8|0.8|0.8
hamming-9-8 (2305, 512) | 2608|2666|2533|2758|2656 | 8.3-7|9.0-7|9.8-7|8.7-7|9.3-7 | 9.9-6|1.1-5|-1.2-5|1.0-5|1.2-5 | 1:17|1:19|1:16|1:25|1:31
hamming-10-2 (23041, 1024) | 700|644|630|631|630 | 8.3-7|4.5-7|7.8-7|7.2-7|9.6-7 | -7.6-6|1.2-5|1.9-5|-1.9-5|-1.9-5 | 2:16|2:01|2:02|1:55|1:55
hamming-7-5-6 (1793, 128) | 532|524|537|539|537 | 6.0-7|9.6-7|6.7-7|9.7-7|9.3-7 | 3.5-6|-5.4-6|-3.6-6|3.7-6|4.8-6 | 0.8|0.8|0.9|0.8|0.9
hamming-8-3-4 (16129, 256) | 193|173|189|186|186 | 6.6-7|8.1-7|3.9-7|6.0-7|5.1-7 | -2.5-6|6.1-6|3.6-6|5.0-6|4.2-6 | 1.2|1.1|1.2|1.2|1.2
hamming-9-5-6 (53761, 512) | 1029|1068|1054|1036|1065 | 7.4-7|7.0-7|8.3-7|7.5-7|8.2-7 | 8.8-6|-7.9-6|9.6-6|8.6-6|1.0-5 | 33.7|34.5|34.1|33.6|34.7
brock200-1 (5067, 200) | 241|196|186|202|251 | 9.4-7|9.4-7|9.1-7|9.9-7|9.7-7 | -2.2-7|1.1-7|1.2-7|1.3-7|-1.4-9 | 1.2|0.9|0.9|1|1.2
brock200-4 (6812, 200) | 202|173|167|160|160 | 9.8-7|9.6-7|9.5-7|9.8-7|9.9-7 | -4.7-8|8.9-8|9.9-8|7.5-8|6.9-8 | 1|0.8|0.8|0.8|0.8
brock400-1 (20078, 400) | 250|219|189|225|451 | 9.9-7|9.9-7|9.9-7|9.9-7|8.7-7 | 5.0-8|6.9-8|-3.8-7|1.2-8|9.4-10 | 5.6|4.8|4.3|5|10.1
keller4 (5101, 171) | 259|211|203|188|188 | 9.9-7|9.8-7|9.4-7|9.5-7|9.7-7 | -4.7-7|-3.0-7|-3.2-7|-2.3-7|-2.3-7 | 0.7|0.6|0.6|0.5|0.5
p-hat300-1 (33918, 300) | 834|591|623|604|641 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -8.5-7|-2.6-7|1.1-7|7.1-8|-4.0-9 | 8.8|6.3|6.6|6.3|6.8
G43 (9991, 1000) | 1527|1320|1309|1310|3501 | 9.3-7|9.4-7|9.1-7|8.8-7|9.4-7 | 2.8-6|3.2-6|-3.1-6|2.9-6|5.2-9 | 4:13|3:34|3:33|3:37|9:35
G44 (9991, 1000) | 1527|1319|1309|1310|3513 | 8.8-7|9.9-7|8.8-7|8.3-7|9.9-7 | 3.0-6|3.4-6|-3.2-6|2.9-6|6.4-9 | 4:10|3:34|3:33|3:36|9:36
G45 (9991, 1000) | 1562|1318|1331|1309|4001 | 9.2-7|9.1-7|9.4-7|8.8-7|9.4-7 | -2.4-6|3.6-6|2.8-6|3.5-6|2.5-9 | 4:18|3:37|3:39|3:35|10:55
G46 (9991, 1000) | 1593|1342|1353|1307|3601 | 9.9-7|8.9-7|8.7-7|9.2-7|8.0-7 | 2.5-6|-3.4-6|-2.7-6|3.5-6|4.8-9 | 4:24|3:41|3:43|3:36|9:51
G47 (9991, 1000) | 1568|1323|1312|1313|3502 | 9.0-7|9.2-7|8.8-7|8.2-7|9.9-7 | -2.1-6|2.7-6|-2.9-6|2.7-6|-5.8-9 | 4:20|3:38|3:36|3:36|9:38
G51 (5910, 1000) | 7395|6135|5035|4977|7827 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -2.9-7|-1.4-7|-5.4-8|-2.2-7|-1.1-6 | 23:37|20:21|15:36|15:20|23:31
G52 (5917, 1000) | 26218|20278|16964|15980|15793 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -9.4-6|-9.7-6|-8.9-6|-8.9-6|-9.1-6 | 1:13:48|57:09|47:45|45:18|44:24
G53 (5915, 1000) | 100000|16476|17439|16424|17300 | 1.5-6|9.9-7|9.9-7|9.9-7|9.9-7 | -7.0-6|-8.4-6|-7.8-6|-7.9-6|-7.8-6 | 4:57:56|44:29|44:54|42:20|44:34
G54 (5917, 1000) | 4715|4592|4152|3806|4532 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | 1.3-5|7.2-6|-9.8-6|-3.8-6|7.8-6 | 13:10|12:49|11:40|10:34|12:26



Table 3 (continued)

problem (m, n) | iteration | ηSDP | ηgap | CPU time
1dc.64 (544, 64) | 4046|2805|2559|2492|3679 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -9.4-8|3.3-6|2.6-7|6.6-7|-3.3-8 | 4.2|2.4|2.1|2.1|2.9
1et.64 (265, 64) | 247|211|207|215|351 | 9.8-7|9.8-7|9.6-7|9.2-7|9.5-7 | -2.3-6|2.1-6|-2.1-6|1.2-6|-3.9-8 | 0.2|0.2|0.2|0.2|0.3
1tc.64 (193, 64) | 365|342|320|745|5395 | 8.7-7|7.6-7|8.7-7|9.9-7|9.9-7 | 2.4-6|-2.1-6|-2.3-6|-1.6-8|-9.9-8 | 0.4|0.3|0.3|0.6|4.1
1dc.128 (1472, 128) | 100000|100000|10960|100000|100000 | 2.1-6|2.8-6|9.9-7|2.3-6|4.7-6 | -6.3-6|-6.1-7|-5.2-6|-2.5-6|-1.9-6 | 3:19|3:32|20.9|3:26|3:37
1et.128 (673, 128) | 388|383|372|701|3201 | 8.3-7|7.8-7|8.9-7|8.8-7|9.3-7 | -3.7-6|-2.5-6|2.7-6|1.3-8|1.5-7 | 0.8|0.7|0.7|1.3|5.4
1tc.128 (513, 128) | 1079|951|883|1005|4056 | 9.8-7|8.3-7|8.7-7|9.9-7|9.9-7 | -4.4-6|4.0-6|3.9-6|2.0-6|-6.9-8 | 1.7|1.5|1.4|1.5|5.5
1zc.128 (1121, 128) | 245|197|184|751|5101 | 8.8-7|9.7-7|6.1-7|9.9-7|8.0-7 | 9.1-7|1.1-6|1.8-6|-3.3-8|-2.2-7 | 0.4|0.4|0.4|1.3|7.8
1dc.256 (3840, 256) | 8759|6759|6542|6536|6301 | 9.2-7|9.6-7|9.4-7|8.8-7|9.3-7 | -1.7-5|-1.5-5|-1.6-5|-1.5-5|5.3-6 | 52|40.9|38.7|38.7|35.7
1et.256 (1665, 256) | 1623|1333|1242|1184|4601 | 9.9-7|9.9-7|9.9-7|9.9-7|9.6-7 | -4.9-7|-3.5-7|-5.6-7|-5.4-7|-1.4-6 | 9.8|8.2|7.7|7.3|28
1tc.256 (1313, 256) | 5673|3259|3271|3911|9201 | 9.9-7|9.9-7|9.9-7|9.9-7|9.5-7 | -3.4-6|-3.5-7|-3.4-7|-3.1-6|-2.3-6 | 44.4|24.8|25.8|30.6|1:07
1zc.256 (2817, 256) | 295|302|276|709|4301 | 9.2-7|5.0-7|9.9-7|9.9-7|8.6-7 | 4.0-6|2.3-6|-2.2-6|2.2-8|1.9-7 | 1.7|1.8|1.6|4|23.5
1dc.512 (9728, 512) | 5618|4875|3740|4643|6065 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -2.5-6|-3.0-6|-6.1-6|-3.0-6|-3.3-6 | 4:29|3:53|2:58|3:43|4:50
1et.512 (4033, 512) | 2372|2037|1826|1808|4601 | 9.5-7|9.9-7|9.4-7|9.9-7|8.5-7 | 2.4-8|-2.6-7|-3.2-7|-3.5-7|2.6-7 | 1:26|1:15|1:07|1:06|3:00
1tc.512 (3265, 512) | 5340|7138|4948|4858|6972 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -4.2-6|-5.5-6|-3.9-6|-4.7-6|-3.7-6 | 3:08|4:15|3:10|2:51|4:07
2dc.512 (54896, 512) | 6361|4692|4796|4697|4703 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -2.0-5|-1.4-5|-2.0-5|-2.0-5|-1.9-5 | 4:15|3:06|3:13|2:50|2:51
1zc.512 (6913, 512) | 512|462|471|453|1067 | 8.2-7|8.8-7|9.2-7|9.9-7|9.9-7 | 3.0-6|3.1-6|-2.6-6|3.8-6|5.4-8 | 16.7|15.8|15.2|14.4|34.1
1dc.1024 (24064, 1024) | 5332|3406|3428|3291|4731 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -5.0-6|-6.9-6|-6.2-6|-6.5-6|-5.7-6 | 16:48|11:35|11:27|11:18|16:24
1et.1024 (9601, 1024) | 2854|2502|2194|2474|5005 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -4.2-6|-4.2-6|-5.3-6|-3.1-6|-6.0-6 | 8:56|7:05|6:16|7:39|15:29
1tc.1024 (7937, 1024) | 6057|10345|10261|9771|6149 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -4.3-6|-4.3-6|-4.4-6|-4.4-6|-4.0-6 | 20:46|36:16|36:09|32:40|21:06
1zc.1024 (16641, 1024) | 816|751|767|1001|3301 | 9.3-7|9.1-7|9.4-7|8.4-7|9.9-7 | 5.9-6|4.2-6|4.6-6|-7.5-8|-1.5-6 | 2:26|2:15|2:17|2:58|9:47
2dc.1024 (169163, 1024) | 10074|7629|7491|7893|7625 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -2.5-5|-2.5-5|-2.6-5|-2.6-5|-2.5-5 | 33:15|24:51|24:40|26:04|24:49
1dc.2048 (58368, 2048) | 6436|6115|4962|5898|4801 | 9.9-7|9.9-7|9.9-7|9.9-7|9.7-7 | 5.8-7|-5.4-6|-1.1-5|-5.5-6|-8.1-6 | 2:35:40|2:36:50|2:02:20|2:51:39|2:09:42
1et.2048 (22529, 2048) | 4697|7488|4501|10578|5301 | 9.9-7|9.9-7|9.9-7|9.9-7|8.7-7 | -4.7-6|-5.3-6|-9.9-6|-5.3-6|-4.1-6 | 2:10:34|3:34:06|1:55:32|4:32:21|2:27:10
1tc.2048 (18945, 2048) | 6623|5975|7106|12207|5769 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -6.9-6|-6.5-6|-8.7-6|-6.0-6|-7.0-6 | 3:06:57|2:42:54|2:57:39|5:19:13|2:22:22
1zc.2048 (39425, 2048) | 1530|1395|1391|1408|4201 | 7.5-7|9.9-7|9.7-7|9.9-7|8.4-7 | 4.3-6|5.4-6|4.9-6|5.3-6|2.2-8 | 34:08|35:07|32:37|33:02|1:39:31
2dc.2048 (504452, 2048) | 6561|5134|4800|4745|5472 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -2.5-5|-2.4-5|-2.5-5|-2.5-5|-2.3-5 | 2:57:37|2:09:21|2:09:43|1:47:49|2:03:19
fap01 (1378, 52) | 1839|1449|1540|1419|1417 | 9.6-7|9.5-7|9.9-7|9.9-7|9.7-7 | 1.5-6|-1.4-5|-1.8-5|-1.8-5|-1.7-5 | 2.4|1.3|1.4|1.3|1.3
fap02 (1866, 61) | 5171|4199|3770|5420|4257 | 9.9-7|9.9-7|9.9-7|9.9-7|9.8-7 | -5.0-6|5.2-7|4.6-6|-1.0-5|5.0-6 | 4|3.2|2.9|4.3|3.3
fap03 (2145, 65) | 2600|2443|2080|2351|2345 | 9.9-7|9.9-7|9.8-7|9.9-7|9.9-7 | -4.1-6|-1.3-5|-1.5-5|-1.3-5|-1.3-5 | 2.3|2.2|1.8|2.1|2.2
fap04 (3321, 81) | 2174|1751|1601|1641|1626 | 9.9-7|8.9-7|8.1-7|9.8-7|9.9-7 | -5.6-6|5.3-6|1.1-6|1.1-5|4.8-6 | 3.1|2.7|2.2|2.4|2.2
fap05 (3570, 84) | 2457|1928|1753|1739|1736 | 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7 | -5.1-6|9.9-6|9.9-6|-5.0-6|3.9-6 | 3.5|2.9|2.5|2.4|2.5
fap06 (4371, 93) | 1668|1317|1223|1174|1172 | 9.9-7|9.4-7|9.9-7|9.9-7|9.9-7 | -9.5-7|-1.5-7|-2.3-6|6.1-6|6.2-6 | 2.9|2.2|1.9|1.9|1.9
fap07 (4851, 98) | 1873|1558|1418|1392|1389 | 9.9-7|9.1-7|8.4-7|9.3-7|9.4-7 | 1.5-6|1.2-6|2.5-6|-7.4-7|-7.9-7 | 2.9|2.4|2.3|2.3|2.4
fap08 (7260, 120) | 1617|1293|1198|1201|1201 | 9.9-7|9.9-7|9.9-7|9.6-7|8.4-7 | -1.7

-6|-

1.7

-6|-

1.8

-6|-

6.8

-6|1

.7-6

3.6|2

.6|2

.4|2

.5|2

.5

fap09

15225|1

74

951|7

58|7

01|7

18|7

16

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-6|-

1.1

-6|-

8.3

-7|-

1.1

-6|-

1.1

-63.4|2

.8|2

.9|2

.8|3

fap10

14479|1

83

5156|4

038|3

838|3

568|3

470

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-6.6

-5|-

9.0

-5|-

7.5

-5|-

5.8

-5|-

5.8

-526.2|2

0.5|1

9.3|1

8.2|1

7.7

fap11

24292|2

52

5428|4

358|4

062|3

978|3

965

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-4|-1.1

-4|-1.2

-4|-1.2

-4|-1.2

-450.8|4

1.9|4

2|3

5.7|3

5.5

fap12

26462|3

69

7414|6

192|5

716|5

501|5

808

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.0

-4|-1.3

-4|-1.3

-4|-1.2

-4|-1.2

-42:0

6|1

:44|1

:36|1

:33|1

:38

fap25

322924|2

118

11830|9

292|8

504|8

409|1

1289

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-8.0

-5|-

8.2

-5|-

9.1

-5|-

8.1

-5|-

8.0

-55:4

7:2

1|

3:3

2:4

9|

3:0

5:4

2|

3:2

6:0

9|

4:5

0:1

0

fap36

1154467|4

110

8085|6

495|5

929|5

980|9

001

9.9

-7|1

.0-6|1

.0-6|9

.9-7|9

.7-7

-2.8

-5|-

2.8

-5|-

2.8

-5|-

2.8

-5|-

2.4

-521:3

6:2

6|

17:0

7:3

9|

16:0

5:0

0|

15:5

4:3

1|

25:0

5:5

8

Rn3m

20p3

20000|3

00

644|5

27|4

86|4

77|4

76

9.9

-7|9

.9-7|9

.9-7|9

.7-7|9

.7-7

-8.6

-6|-

8.3

-6|-

8.2

-6|-

8.0

-6|-

8.0

-619.4|1

5.7|1

3.4|1

5.1|1

6.2

Rn3m

25p3

25000|3

00

561|4

73|4

42|4

30|4

29

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.1

-5|-

2.8

-5|-

2.8

-5|-

2.8

-5|-

2.7

-545.5|3

8.5|2

9.4|3

2.6|3

2.4

Rn3m

10p4

10000|3

00

1076|8

24|8

16|9

04|5

602

9.9

-7|9

.9-7|9

.9-7|9

.9-7|7

.0-7

-2.6

-5|-

2.6

-5|-

2.6

-5|-

2.6

-5|-

1.7

-645.9|3

9.2|3

0.1|4

3.5|4

:02

40

Page 41: arXiv:1803.10803v2 [math.OC] 28 Jan 2019March 28, 2018; Revised on Jan 28, 2019 Abstract In this paper, we show that for a class of linearly constrained convex composite optimization

Table

3(c

onti

nued)

itera

tion

ηSD

Pηgap

CP

Uti

me

pro

ble

mm|n

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

Rn4m

30p3

30000|4

00

781|6

05|5

74|5

61|6

41

9.9

-7|9

.9-7|9

.8-7|9

.9-7|6

.5-7

-2.0

-5|-

1.9

-5|-

1.9

-5|-

1.9

-5|-

1.0

-535.4|2

5.9|2

0.9|2

3.3|2

8.4

Rn4m

40p3

40000|4

00

645|5

02|5

14|5

03|5

02

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-8.5

-6|-

7.8

-6|-

7.9

-6|-

7.9

-6|-

7.9

-61:3

3|1

:08|1

:02|1

:16|1

:14

Rn4m

15p4

15000|4

00

1193|9

93|9

26|9

98|6

002

9.9

-7|9

.9-7|9

.9-7|9

.8-7|7

.7-7

-2.2

-5|-

2.2

-5|-

2.2

-5|-

2.2

-5|-

9.8

-61:3

4|1

:19|5

9.2|1

:18|7

:32

Rn5m

30p3

30000|5

00

1027|8

32|7

47|8

26|5

416

9.9

-7|9

.9-7|9

.9-7|9

.9-7|8

.4-7

-1.2

-3|-1.2

-3|-1.2

-3|-1.2

-3|9

.7-5

42.2|3

4.3|2

3.4|3

2.2|3

:44

Rn5m

40p3

40000|5

00

918|6

89|6

51|6

58|4

103

9.9

-7|9

.8-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-4|-1.0

-4|-1.0

-4|-1.0

-4|-

9.7

-650.2|4

1.5|2

7.3|3

5.9|3

:41

Rn5m

50p3

50000|5

00

776|6

02|5

72|5

59|5

58

9.9

-7|9

.9-7|9

.7-7|9

.9-7|9

.8-7

-2.1

-5|-

2.0

-5|-

2.0

-5|-

2.0

-5|-

2.0

-51:2

7|1

:11|5

3.2|1

:06|1

:06

Rn5m

20p4

20000|5

00

1257|1

021|1

003|1

226|6

114

9.9

-7|9

.8-7|9

.9-7|9

.9-7|9

.1-7

-1.7

-5|-

1.6

-5|-

1.7

-5|-

1.7

-5|-

1.2

-52:3

6|1

:51|1

:29|2

:08|1

1:0

1

Rn6m

40p3

40000|6

00

1051|8

27|8

17|9

27|5

701

9.9

-7|9

.8-7|9

.9-7|9

.9-7|9

.7-7

-3.5

-5|-

3.4

-5|-

3.5

-5|-

3.5

-5|-

1.3

-51:0

1|4

7.8|4

1.4|5

4.5|5

:40

Rn6m

50p3

50000|6

00

1018|7

87|7

15|7

47|4

802

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.4

-5|-

1.4

-5|-

1.4

-5|-

1.4

-5|1

.1-6

1:1

0|5

2.2|4

0.4|5

2|5

:36

Rn6m

60p3

60000|6

00

839|6

89|6

39|6

33|3

637

9.9

-7|9

.8-7|9

.8-7|9

.8-7|2

.1-7

-2.3

-5|-

2.4

-5|-

2.4

-5|-

2.4

-5|4

.6-6

1:2

5|1

:12|5

3.3|1

:04|5

:48

Rn6m

20p4

20000|6

00

1414|1

303|1

131|1

343|6

478

9.9

-7|4

.8-7|9

.9-7|9

.9-7|9

.9-7

-2.3

-5|-

9.9

-6|-

2.3

-5|-

2.3

-5|-

2.0

-52:0

2|2

:00|1

:21|1

:54|8

:53

Rn7m

50p3

50000|7

00

1126|9

30|8

94|9

45|5

709

9.9

-7|9

.9-7|6

.0-7|9

.9-7|9

.3-7

-3.5

-5|-

3.5

-5|-

1.0

-5|-

3.5

-5|-

1.2

-51:3

7|1

:20|1

:11|1

:21|8

:12

Rn7m

70p3

70000|7

00

1010|7

50|7

34|7

40|4

648

9.9

-7|9

.8-7|9

.9-7|9

.9-7|2

.9-7

-6.1

-5|-

5.9

-5|-

6.0

-5|-

6.0

-5|1

.1-6

1:4

2|1

:16|1

:04|1

:15|7

:49

Rn8m

70p3

70000|8

00

1131|9

35|9

01|9

36|5

493

9.9

-7|9

.9-7|9

.7-7|9

.9-7|4

.9-7

-2.0

-5|-

2.0

-5|-

5.7

-6|-

2.0

-5|-

5.7

-62:1

2|1

:40|1

:41|1

:40|1

3:0

6

Rn8m

100p3

100000|8

00

949|7

44|7

05|6

91|4

003

9.9

-7|9

.8-7|9

.9-7|9

.9-7|1

.9-7

-4.7

-5|-

4.6

-5|-

4.6

-5|-

4.6

-5|6

.4-6

3:0

7|2

:20|1

:50|2

:09|1

1:4

4

be100.1

101|1

01

2694|2

192|1

989|1

810|5

458

9.9

-7|9

.9-7|9

.5-7|9

.9-7|9

.9-7

-9.3

-7|-

9.3

-7|-

1.3

-6|-

3.4

-6|5

.0-7

4|2

.6|3|2

.4|6

.4

be100.2

101|1

01

2211|1

697|1

738|1

707|4

701

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.2-7

-9.9

-7|-

1.8

-7|-

6.6

-7|-

7.1

-7|-

4.0

-72.8|2

.1|2

.2|2

.1|5

.8

be100.3

101|1

01

2615|2

048|2

033|2

037|4

751

9.4

-7|9

.7-7|9

.1-7|9

.6-7|9

.6-7

1.3

-6|1

.5-6|-

1.5

-6|-

3.6

-6|-

2.9

-73.5|2

.8|2

.7|2

.7|5

.5

be100.4

101|1

01

2223|1

854|1

854|1

787|5

491

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

1.9

-7|-

9.7

-7|-

5.3

-7|-

4.6

-7|-

5.3

-72.8|2

.4|2

.4|2

.2|6

.7

be100.5

101|1

01

1872|1

421|1

520|1

429|5

383

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-8.2

-7|-

5.2

-7|-

7.5

-7|-

7.1

-7|-

1.4

-72.4|1

.8|2|1

.9|6

.5

be100.6

101|1

01

2329|1

881|1

949|1

873|4

302

9.2

-7|9

.9-7|9

.9-7|9

.7-7|7

.2-7

-1.5

-7|-

4.5

-7|-

4.9

-7|-

9.4

-7|-

2.1

-72.9|2

.2|2

.5|2

.3|5

.3

be100.7

101|1

01

2201|1

947|1

637|1

739|5

501

9.9

-7|9

.9-7|9

.4-7|9

.9-7|9

.9-7

-8.4

-7|-

3.7

-7|-

4.1

-7|-

2.1

-7|-

5.4

-72.5|2

.2|2

.4|2|6

.1

be100.8

101|1

01

2254|1

929|1

797|1

601|4

165

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-2.8

-7|-

3.8

-7|-

1.7

-7|-

4.9

-7|-

4.7

-72.6|2

.3|2

.3|1

.9|4

.7

be100.9

101|1

01

1755|1

319|1

258|1

192|5

121

9.9

-7|8

.9-7|9

.9-7|9

.9-7|9

.9-7

-6.8

-7|-

2.7

-7|-

9.3

-7|-

6.2

-7|4

.2-8

2|1

.7|1

.7|1

.4|5

.7

be100.1

0101|1

01

1755|1

402|1

321|1

259|5

401

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.8

-7|-

4.4

-7|-

4.8

-7|-

3.3

-7|-

2.7

-72.1|1

.8|1

.7|1

.5|6

be120.3

.1121|1

21

2446|2

122|1

830|1

845|5

201

9.9

-7|9

.9-7|9

.3-7|9

.9-7|9

.3-7

2.4

-7|-

6.7

-7|-

2.5

-7|-

10.0

-7|-

1.7

-73.5|2

.9|3

.1|2

.6|7

.2

be120.3

.2121|1

21

2486|1

729|1

601|1

581|5

201

9.9

-7|9

.3-7|9

.2-7|8

.2-7|8

.0-7

-1.4

-6|-

1.1

-6|-

2.2

-6|-

2.3

-6|-

3.1

-83.5|2

.6|2

.6|2

.4|7

.1

be120.3

.3121|1

21

2465|2

144|1

938|1

839|6

319

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-8.3

-7|-

7.1

-7|-

4.3

-7|-

5.0

-7|-

2.0

-83.5|3

.1|3

.1|2

.8|8

.9

be120.3

.4121|1

21

2921|2

350|2

050|2

032|5

188

9.8

-7|9

.9-7|9

.9-7|9

.9-7|7

.2-7

-5.8

-7|-

4.8

-7|-

2.3

-7|-

6.0

-7|9

.7-8

4.3|3

.3|3

.1|2

.8|7

.3

be120.3

.5121|1

21

2042|1

839|1

699|1

619|5

111

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.4

-6|3

.1-7|2

.6-7|5

.5-7|4

.2-8

2.9|2

.6|2

.6|2

.3|7

be120.3

.6121|1

21

2743|2

135|1

849|1

940|5

957

9.9

-7|9

.9-7|9

.5-7|9

.9-7|9

.9-7

-8.4

-7|-

5.2

-7|-

9.3

-7|-

4.6

-7|-

2.3

-73.6|3|2

.9|2

.6|7

.8

be120.3

.7121|1

21

3229|2

410|2

278|2

248|6

002

9.9

-7|9

.9-7|9

.9-7|9

.9-7|5

.2-7

-9.0

-7|-

7.6

-7|-

4.4

-7|-

5.1

-7|7

.9-8

4.5|3

.4|3

.4|3

.1|8

.2

be120.3

.8121|1

21

2934|2

418|2

166|2

035|5

201

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.3-7

-3.2

-7|-

5.5

-7|-

2.9

-7|-

4.7

-8|-

4.6

-74.3|3

.5|3

.3|3|7

.5

be120.3

.9121|1

21

2133|1

623|1

593|1

564|5

002

9.9

-7|9

.9-7|9

.9-7|9

.9-7|7

.7-7

-9.0

-7|-

9.5

-7|-

8.7

-7|-

9.2

-7|8

.0-8

3|2

.3|2

.4|2

.2|7

be120.3

.10

121|1

21

2355|1

826|1

558|1

632|6

513

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-6|-

1.1

-6|-

1.5

-6|-

9.4

-7|5

.4-7

3.3|2

.6|2

.4|2

.4|8

.9

be120.8

.1121|1

21

2555|1

865|1

974|1

729|4

966

9.9

-7|9

.9-7|9

.9-7|9

.9-7|6

.1-7

-1.4

-8|4

.8-7|4

.5-8|6

.5-7|-

7.8

-83.5|2

.5|3

.2|2

.3|6

.5

be120.8

.2121|1

21

2127|1

768|1

557|1

743|4

719

8.7

-7|9

.9-7|8

.8-7|9

.9-7|9

.9-7

-1.2

-6|-

4.7

-7|-

1.6

-6|-

9.2

-7|4

.3-7

3.2|2

.5|2

.6|2

.5|6

.4

be120.8

.3121|1

21

2267|1

722|1

723|1

658|5

202

9.9

-7|9

.9-7|9

.9-7|9

.9-7|7

.8-7

-5.3

-7|-

1.7

-7|-

9.9

-8|-

5.3

-7|1

.4-7

3.3|2

.5|2

.7|2

.4|7

.5

be120.8

.4121|1

21

2438|1

770|1

810|1

705|5

537

9.9

-7|9

.9-7|9

.9-7|9

.9-7|5

.5-7

-6.2

-7|-

3.3

-6|-

8.3

-7|2

.5-7|-

2.0

-73.5|2

.5|2

.8|2

.4|7

.5

be120.8

.5121|1

21

2657|1

879|1

835|1

859|5

306

9.9

-7|9

.2-7|9

.8-7|8

.6-7|9

.5-7

-2.3

-7|1

.1-6|4

.0-6|-

2.0

-6|-

4.1

-73.5|2

.7|2

.8|2

.7|7

.1

be120.8

.6121|1

21

2023|1

653|1

547|1

525|5

598

9.9

-7|9

.9-7|9

.9-7|9

.9-7|8

.9-7

-1.7

-6|-

1.2

-6|-

1.3

-6|-

1.3

-6|4

.4-8

2.8|2

.3|2

.3|2

.1|7

.7

be120.8

.7121|1

21

2342|1

809|1

894|1

792|5

833

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.3-7

-2.8

-7|-

3.4

-7|-

1.5

-8|-

8.0

-8|-

3.5

-73.3|2

.6|2

.8|2

.5|8

.2

be120.8

.8121|1

21

2260|1

793|1

685|1

500|4

935

9.9

-7|9

.9-7|9

.9-7|9

.9-7|6

.5-7

4.2

-8|8

.2-8|2

.1-7|3

.0-7|5

.4-9

3.4|2

.6|2

.7|2

.2|7

.1

41

Page 42: arXiv:1803.10803v2 [math.OC] 28 Jan 2019March 28, 2018; Revised on Jan 28, 2019 Abstract In this paper, we show that for a class of linearly constrained convex composite optimization

Table

3(c

onti

nued)

itera

tion

ηSD

Pηgap

CP

Uti

me

pro

ble

mm|n

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

be120.8

.9121|1

21

1988|1

763|1

631|1

505|6

299

9.6

-7|9

.9-7|9

.9-7|9

.9-7|9

.6-7

-1.0

-6|-

1.0

-6|-

8.7

-7|-

1.5

-6|5

.8-8

3|2

.7|2

.9|2

.2|9

be120.8

.10

121|1

21

2227|1

900|1

624|1

636|5

701

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.1-7

-8.1

-7|-

8.5

-7|-

1.0

-6|-

7.6

-7|-

4.9

-73.1|2

.7|2

.6|2

.3|7

.8

be150.3

.1151|1

51

3270|2

533|2

282|2

210|5

095

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-7.2

-7|-

7.8

-7|-

8.5

-7|-

1.1

-6|2

.7-7

6.7|5

.1|5

.1|4

.5|1

0.2

be150.3

.2151|1

51

3355|2

724|2

472|2

346|4

801

9.9

-7|9

.9-7|9

.9-7|9

.9-7|8

.6-7

-7.1

-7|-

8.4

-7|-

8.2

-7|-

5.1

-7|-

2.6

-76.8|5

.5|5

.2|4

.8|9

.5

be150.3

.3151|1

51

3189|2

306|2

360|1

982|5

601

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.6-7

-1.0

-6|-

7.8

-7|-

8.3

-7|-

1.7

-6|5

.1-7

6.7|4

.8|5

.3|4

.2|1

1.5

be150.3

.4151|1

51

3794|3

040|2

672|2

558|5

307

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.6-7

-5.7

-7|-

4.9

-7|-

1.8

-7|-

5.6

-7|3

.8-7

7.6|6

.1|5

.7|5

.1|1

0.8

be150.3

.5151|1

51

2797|2

352|2

069|2

117|5

320

9.9

-7|9

.9-7|9

.9-7|9

.9-7|5

.7-7

-2.3

-7|-

5.1

-7|-

7.0

-7|-

4.1

-7|-

1.1

-85.7|4

.7|4

.5|4

.3|1

0.5

be150.3

.6151|1

51

2758|2

365|2

068|2

120|5

532

9.9

-7|9

.9-7|9

.9-7|9

.9-7|7

.1-7

-8.0

-7|-

5.9

-7|-

8.4

-7|-

5.5

-7|-

3.9

-75.6|4

.8|4

.5|4

.4|1

1

be150.3

.7151|1

51

2937|2

398|2

074|2

174|4

901

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.4

-7|-

3.5

-7|-

5.6

-7|-

4.2

-7|-

3.7

-75.6|4

.7|4

.4|4

.3|9

.2

be150.3

.8151|1

51

3001|2

215|1

979|1

871|5

901

9.9

-7|9

.9-7|9

.7-7|9

.4-7|8

.0-7

-1.2

-7|-

3.1

-7|-

6.1

-7|-

1.5

-6|-

3.1

-76|4

.5|4

.5|4|1

2

be150.3

.9151|1

51

2370|1

680|1

592|1

452|5

202

9.9

-7|9

.9-7|9

.9-7|9

.9-7|6

.4-7

-1.1

-6|-

7.7

-7|-

1.3

-6|-

1.2

-6|6

.1-8

4.9|3

.5|3

.5|3|1

0.4

be150.3

.10

151|1

51

3227|2

618|2

182|2

258|4

701

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-5.8

-7|-

4.1

-7|-

4.4

-7|-

2.9

-7|2

.9-7

6.8|5

.4|4

.6|4

.7|9

.4

be150.8

.1151|1

51

2717|2

203|1

962|1

834|6

302

9.9

-7|9

.9-7|9

.9-7|9

.9-7|6

.5-7

-1.1

-6|-

1.0

-6|-

10.0

-7|-

8.2

-7|-

2.2

-75.6|4

.5|4

.2|3

.8|1

2.5

be150.8

.2151|1

51

2898|2

326|2

164|2

049|5

717

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-7.3

-7|-

4.4

-7|-

7.1

-7|-

7.2

-7|-

5.4

-76.4|4

.9|4

.8|4

.3|1

1.7

be150.8

.3151|1

51

3367|2

815|2

644|2

528|5

401

9.8

-7|9

.9-7|9

.9-7|9

.9-7|9

.7-7

-2.8

-7|-

6.9

-8|-

1.3

-7|-

3.7

-8|3

.8-7

7.2|5

.7|5

.7|5

.1|1

0.9

be150.8

.4151|1

51

2724|2

114|2

057|1

975|6

025

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-6.5

-7|-

1.1

-6|-

7.7

-7|-

7.8

-7|-

1.1

-75.5|4

.3|4

.4|4

.1|1

2

be150.8

.5151|1

51

3125|2

505|2

150|2

043|4

796

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-5.6

-7|-

4.2

-7|6

.0-7|2

.9-7|1

.8-7

6.3|5

.1|4

.6|4

.1|9

.5

be150.8

.6151|1

51

2329|2

148|1

819|1

926|5

301

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.1-7

-3.2

-6|-

9.0

-7|-

1.9

-6|-

7.8

-7|1

.3-7

4.6|4

.2|3

.7|3

.8|1

0

be150.8

.7151|1

51

3687|2

439|2

315|2

358|6

244

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.6-7

-3.6

-7|-

4.6

-7|-

1.1

-6|-

2.7

-7|-

2.1

-77.3|4

.8|5|4

.6|1

2.5

be150.8

.8151|1

51

3310|2

566|2

415|2

134|4

601

9.9

-7|9

.9-7|9

.9-7|9

.3-7|9

.4-7

-6.3

-7|-

5.9

-7|-

7.6

-7|-

6.7

-7|2

.2-7

6.9|5

.2|5

.3|4

.7|9

.1

be150.8

.9151|1

51

2843|1

975|1

930|1

920|4

595

9.8

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-6|-

1.4

-6|-

1.1

-6|-

1.3

-6|-

4.9

-76.1|4

.1|4

.5|4|9

.2

be150.8

.10

151|1

51

3358|2

517|2

499|2

279|4

901

9.8

-7|9

.9-7|9

.9-7|9

.9-7|9

.0-7

-6.8

-7|-

7.1

-7|-

6.9

-7|-

8.4

-7|1

.4-7

7.1|5

.2|5

.4|4

.7|1

0

be200.3

.1201|2

01

3446|3

055|2

334|2

389|6

347

9.8

-7|9

.9-7|9

.9-7|9

.9-7|9

.8-7

-6.3

-7|-

8.7

-7|-

4.8

-7|-

1.1

-6|-

4.4

-711.7|1

0.2|8

.2|7

.8|2

1

be200.3

.2201|2

01

4035|2

920|2

850|2

555|6

201

9.8

-7|9

.9-7|9

.8-7|9

.9-7|8

.3-7

-5.8

-7|-

6.5

-7|-

1.2

-6|-

9.4

-7|2

.9-7

13.1|9

.3|9

.5|8

.1|1

8.8

be200.3

.3201|2

01

4491|3

515|3

332|3

219|5

901

9.5

-7|9

.7-7|9

.9-7|9

.9-7|9

.4-7

9.5

-8|-

4.8

-7|-

8.0

-7|-

3.4

-7|6

.6-7

15.1|1

1.3|1

0.9|1

0.1|1

8.2

be200.3

.4201|2

01

4254|3

111|3

137|2

894|6

201

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.0-7

-5.5

-7|-

9.3

-7|-

8.4

-7|-

1.1

-6|2

.8-7

13.2|1

0|1

0.4|9

.3|1

9

be200.3

.5201|2

01

3918|3

122|2

608|2

369|6

001

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-8.3

-7|-

1.3

-6|-

1.3

-6|-

8.8

-7|1

.4-7

13|9

.8|8

.4|7

.5|1

8.5

be200.3

.6201|2

01

3493|2

393|2

744|2

642|5

601

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.7-7

3.1

-7|-

2.8

-6|-

8.0

-7|-

1.1

-6|-

2.8

-711.2|7

.8|9

.5|8

.6|1

7.7

be200.3

.7201|2

01

4573|3

489|3

174|3

069|5

851

9.9

-7|9

.9-7|9

.9-7|9

.6-7|9

.9-7

3.8

-9|7

.4-8|-

3.6

-7|-

3.3

-7|6

.0-7

14.6|1

1.3|1

0.4|1

0|1

8.5

be200.3

.8201|2

01

4148|3

211|2

992|2

906|6

301

9.7

-7|9

.9-7|9

.9-7|9

.9-7|8

.5-7

-5.3

-7|-

7.3

-7|-

1.2

-6|-

1.1

-6|2

.9-7

13.5|1

0.2|9

.5|9

.4|1

9.3

be200.3

.9201|2

01

3890|2

910|2

695|2

842|6

019

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.6

-7|-

1.1

-6|-

6.9

-7|-

9.9

-7|4

.3-7

12.3|9

.1|8

.7|9

.3|1

8.9

be200.3

.10

201|2

01

3156|2

753|2

364|2

435|5

677

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.5-7

-5.6

-7|-

4.4

-7|-

2.5

-7|-

3.6

-7|-

3.6

-79.8|8

.8|8|7

.8|1

8

be200.8

.1201|2

01

4599|3

750|3

417|3

430|5

501

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.0-7

-1.3

-6|-

9.6

-7|-

9.9

-7|-

1.0

-6|2

.1-8

14.7|1

1.9|1

1.6|1

0.9|1

7.6

be200.8

.2201|2

01

4094|3

063|3

023|2

806|5

872

9.7

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.9

-7|-

7.7

-7|-

5.5

-7|-

4.2

-7|-

4.0

-713.4|9

.6|9

.9|8

.8|1

8.1

be200.8

.3201|2

01

4257|3

025|3

035|2

895|5

701

9.9

-7|9

.9-7|9

.9-7|9

.9-7|8

.8-7

-3.9

-9|-

6.1

-7|-

7.9

-7|-

9.8

-7|9

.0-8

13.7|9

.8|1

7.2|9

.5|1

7.9

be200.8

.4201|2

01

4358|3

079|2

860|2

660|6

901

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.3-7

-8.9

-7|3

.5-7|-

5.7

-7|-

2.1

-7|-

1.6

-713.8|9

.7|1

0|8

.4|2

1.2

be200.8

.5201|2

01

3776|2

878|2

631|2

242|5

501

9.9

-7|9

.9-7|9

.9-7|9

.4-7|8

.6-7

-5.8

-7|-

7.6

-7|-

7.4

-7|-

4.5

-7|-

3.0

-712.5|9

.1|8

.5|7

.5|1

6.8

be200.8

.6201|2

01

4717|3

307|2

999|2

964|6

427

9.9

-7|9

.9-7|9

.9-7|9

.2-7|9

.9-7

-3.5

-6|-

2.8

-6|2

.4-6|1

.2-6|5

.6-7

14.2|1

0.2|9

.7|9

.8|1

9.1

be200.8

.7201|2

01

4392|3

574|2

968|3

172|5

701

9.9

-7|9

.9-7|9

.9-7|9

.9-7|7

.5-7

-3.5

-7|-

1.3

-6|-

6.2

-7|-

6.4

-7|-

4.7

-714.4|1

1.7|9

.7|1

0.3|1

8.3

be200.8

.8201|2

01

4110|3

295|2

951|2

824|6

101

9.8

-7|9

.9-7|9

.9-7|9

.9-7|8

.8-7

-2.7

-7|-

2.6

-7|-

4.5

-7|-

7.5

-7|3

.5-7

13.3|1

0.1|9

.2|8

.8|1

8.2

be200.8

.9201|2

01

4121|2

974|2

714|2

540|5

901

9.9

-7|9

.9-7|9

.9-7|9

.9-7|8

.7-7

-1.2

-6|-

6.8

-7|-

1.2

-6|-

3.8

-6|1

.1-7

13|9

.4|8

.8|8

.1|1

8.3

be200.8

.10

201|2

01

3586|2

869|2

739|2

743|4

482

9.9

-7|9

.9-7|9

.9-7|9

.8-7|9

.9-7

-5.1

-7|-

5.4

-7|-

8.9

-7|-

8.9

-7|5

.0-7

11.6|9

.3|9

.5|9

.3|1

4.5

be250.1

251|2

51

6404|4

571|4

299|4

293|7

001

9.9

-7|9

.9-7|9

.8-7|9

.7-7|9

.9-7

-3.5

-6|-

1.2

-6|-

5.4

-7|-

5.3

-7|-

8.7

-830.4|2

1.1|2

1.5|2

0.8|3

2.1

42

Page 43: arXiv:1803.10803v2 [math.OC] 28 Jan 2019March 28, 2018; Revised on Jan 28, 2019 Abstract In this paper, we show that for a class of linearly constrained convex composite optimization

Table

3(c

onti

nued)

itera

tion

ηSD

Pηgap

CP

Uti

me

pro

ble

mm|n

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

be250.2

251|2

51

5804|4

388|3

916|3

795|7

301

9.9

-7|9

.8-7|9

.9-7|9

.9-7|9

.8-7

3.6

-6|-

8.7

-7|-

6.8

-7|-

1.1

-6|6

.8-7

27.1|2

1.8|1

8.7|1

7.5|3

3.1

be250.3

251|2

51

6255|4

411|3

231|3

150|6

976

9.9

-7|9

.9-7|9

.5-7|9

.9-7|9

.9-7

-3.2

-6|-

4.9

-7|-

1.3

-6|-

3.5

-6|4

.0-7

29.6|1

9.9|1

6.3|1

5|3

2.5

be250.4

251|2

51

6945|5

107|4

207|4

164|6

451

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-6|-

1.4

-6|-

9.4

-7|-

1.1

-6|-

2.2

-734.4|2

4.3|2

1.1|1

9.8|3

1.8

be250.5

251|2

51

4735|3

541|3

275|3

254|6

695

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-7.6

-8|-

1.4

-7|-

5.3

-7|-

4.5

-7|5

.5-7

22|1

6.7|1

6|1

5.4|3

0.4

be250.6

251|2

51

5335|3

888|3

521|3

562|6

437

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.8-7

6.6

-7|-

4.7

-7|-

9.3

-7|-

6.4

-7|-

3.1

-725.8|1

8.6|1

7.5|1

8|3

0.9

be250.7

251|2

51

5803|4

330|3

975|3

434|6

503

9.9

-7|9

.9-7|9

.8-7|9

.9-7|9

.9-7

-1.7

-6|-

1.1

-6|-

7.8

-7|-

1.1

-6|8

.0-7

26.7|2

0.3|1

9.6|1

6|2

9.6

be250.8

251|2

51

5543|4

225|3

790|3

653|6

364

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.8-7

2.1

-6|3

.1-7|-

7.9

-7|-

1.1

-6|3

.0-7

26.7|2

0.3|1

9.2|1

7.7|3

1.3

be250.9

251|2

51

4764|3

732|3

602|3

487|6

701

9.9

-7|9

.9-7|9

.9-7|9

.7-7|9

.5-7

-1.8

-6|-

9.7

-7|-

4.2

-7|-

6.2

-7|2

.2-7

22.1|1

7.4|1

7.3|1

7.3|3

1

be250.1

0251|2

51

5548|3

889|3

461|3

480|6

213

9.8

-7|9

.9-7|9

.9-7|9

.8-7|9

.9-7

-2.1

-6|-

2.5

-7|1

.4-7|-

4.9

-7|-

3.6

-726.6|1

8.2|1

6.8|1

6.7|2

8.5

bqp50-1

51|5

1855|7

50|6

47|6

80|5

201

9.9

-7|9

.9-7|9

.8-7|9

.9-7|9

.5-7

-5.0

-7|-

2.2

-7|1

.7-7|-

4.0

-7|9

.2-8

0.5|0

.4|0

.4|0

.4|2

.7

bqp50-2

51|5

12463|2

424|2

074|1

859|4

502

9.9

-7|9

.9-7|9

.9-7|9

.9-7|6

.9-7

-4.2

-7|-

8.4

-7|-

6.5

-7|-

1.2

-6|4

.1-8

1.3|1

.2|1

.2|1

.1|2

.3

bqp50-3

51|5

13650|2

518|2

312|2

171|5

001

9.5

-7|9

.4-7|9

.9-7|9

.8-7|9

.2-7

-1.7

-6|7

.0-7|-

3.5

-6|3

.3-6|5

.2-7

1.8|1

.3|1

.3|1

.1|2

.3

bqp50-4

51|5

11828|1

436|1

650|1

384|4

501

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.7-7

2.2

-7|9

.3-8|-

9.1

-7|3

.5-7|7

.0-8

1|0

.8|0

.9|0

.7|2

.3

bqp50-5

51|5

11702|1

217|1

129|1

117|4

623

9.8

-7|9

.6-7|9

.8-7|8

.0-7|9

.9-7

-2.7

-6|-

2.9

-6|-

1.8

-6|-

1.5

-6|-

1.8

-70.8|0

.6|0

.7|0

.6|2

.1

bqp50-6

51|5

12538|1

960|1

749|1

758|4

583

9.7

-7|9

.0-7|9

.7-7|9

.3-7|8

.4-7

-1.4

-6|2

.2-6|-

3.8

-6|2

.0-6|5

.2-8

1.3|1

.3|1|1

.4|2

.9

bqp50-7

51|5

11921|1

478|1

479|1

369|5

401

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.8-7

-1.7

-7|-

3.1

-7|-

3.2

-7|-

9.2

-7|8

.2-8

1|0

.8|0

.8|0

.7|2

.7

bqp50-8

51|5

11380|1

261|1

111|1

139|6

701

9.2

-7|9

.1-7|8

.9-7|8

.9-7|9

.1-7

-9.0

-7|-

1.6

-6|2

.1-6|-

7.0

-7|5

.2-7

0.8|0

.7|0

.7|0

.7|3

.3

bqp50-9

51|5

11848|1

534|1

337|1

377|6

101

8.7

-7|9

.0-7|9

.8-7|8

.1-7|9

.5-7

1.2

-6|1

.0-6|2

.9-6|-

1.9

-6|-

1.8

-71|0

.8|0

.8|0

.8|3

.2

bqp50-1

051|5

11961|1

571|1

440|1

435|1

572

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-5.8

-7|-

5.7

-7|-

7.6

-7|-

3.2

-7|-

8.1

-81.1|1

.1|0

.8|0

.9|0

.9

bqp100-1

101|1

01

2003|1

705|1

517|1

645|6

401

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.5-7

-3.3

-7|5

.6-8|-

3.0

-7|1

.2-8|-

5.3

-72.5|2

.1|1

.9|2|7

.8

bqp100-2

101|1

01

2748|2

256|2

124|2

031|6

218

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.8

-7|-

9.5

-7|-

8.2

-7|-

9.1

-7|-

8.9

-83.5|2

.9|2

.6|2

.3|7

bqp100-3

101|1

01

3441|2

664|2

505|2

627|5

801

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.5-7

-9.9

-7|-

4.6

-7|-

1.3

-7|-

1.5

-7|3

.7-7

3.9|3

.1|2

.9|3|6

.4

bqp100-4

101|1

01

2287|1

683|1

850|1

614|5

602

9.9

-7|9

.9-7|9

.9-7|9

.9-7|5

.1-7

-3.4

-7|-

2.0

-7|-

6.9

-7|-

8.8

-7|4

.6-8

2.5|1

.9|2

.3|1

.9|6

.1

bqp100-5

101|1

01

2280|1

755|1

838|1

786|5

402

9.9

-7|9

.9-7|9

.9-7|9

.9-7|6

.5-7

-8.8

-7|-

2.0

-6|-

1.4

-6|-

1.3

-6|7

.3-8

2.6|2|2

.3|2|5

.9

bqp100-6

101|1

01

2198|1

819|1

658|1

606|6

487

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-7.0

-7|-

7.2

-7|-

6.4

-7|1

.4-7|-

5.4

-72.4|2

.1|2|1

.8|7

.1

bqp100-7

101|1

01

2469|1

888|1

895|1

830|5

440

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.6-7

3.0

-7|-

2.2

-6|3

.3-7|-

9.2

-8|-

3.4

-72.8|2

.2|2

.4|2

.1|6

bqp100-8

101|1

01

2611|1

923|1

687|1

716|5

753

9.9

-7|9

.6-7|9

.0-7|9

.9-7|9

.9-7

-2.9

-7|-

1.9

-7|-

1.2

-6|2

.2-6|4

.2-8

2.9|2

.2|2

.3|2|6

.2

bqp100-9

101|1

01

3622|2

840|2

564|2

779|5

605

9.9

-7|9

.9-7|9

.4-7|9

.9-7|9

.9-7

-4.2

-9|-

3.2

-8|-

7.9

-7|-

8.4

-8|-

3.4

-74|3

.1|3

.3|3

.1|6

.2

bqp100-1

0101|1

01

3164|2

366|2

447|2

392|5

301

9.9

-7|9

.7-7|9

.9-7|9

.9-7|9

.1-7

-1.1

-6|-

8.7

-7|-

1.6

-6|-

1.3

-6|-

2.5

-73.6|2

.8|3|2

.7|5

.8

bqp250-1

251|2

51

6482|4

851|4

394|4

427|6

962

9.9

-7|9

.9-7|9

.9-7|9

.7-7|9

.8-7

2.7

-6|-

4.5

-7|-

9.7

-7|-

5.4

-7|2

.4-7

30|2

2.5|2

1.2|2

1.8|3

3.1

bqp250-2

251|2

51

6081|4

463|4

135|4

036|6

953

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.2

-6|-

3.6

-7|-

3.2

-7|-

8.0

-7|3

.6-7

28.3|2

0.6|2

0|1

8.9|3

3

bqp250-3

251|2

51

6863|4

596|4

345|4

233|6

201

9.8

-7|9

.9-7|9

.7-7|9

.8-7|9

.9-7

-2.3

-6|-

2.5

-8|-

5.8

-7|-

5.8

-7|-

3.1

-732.9|2

1.4|2

2.1|2

0.8|2

9.5

bqp250-4

251|2

51

4934|3

732|2

857|2

771|6

601

9.9

-7|9

.9-7|9

.5-7|9

.6-7|9

.6-7

-1.1

-6|-

7.1

-7|7

.9-7|1

.4-7|5

.4-7

23.6|1

7.9|1

5.1|1

4.3|3

1.3

bqp250-5

251|2

51

7959|5

424|4

757|4

589|7

019

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.8-7

5.8

-6|-

5.1

-7|-

1.8

-6|-

1.6

-6|-

3.0

-736.6|2

4.9|2

2.2|2

1.2|3

1.8

bqp250-6

251|2

51

4656|3

531|3

325|3

169|6

085

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-2.6

-7|-

1.4

-6|-

2.6

-7|-

8.7

-7|9

.2-8

22.6|1

7.2|1

6.5|1

5.4|2

8.9

bqp250-7

251|2

51

6554|4

700|4

349|4

320|7

014

9.9

-7|9

.5-7|9

.9-7|9

.9-7|9

.9-7

-8.3

-7|-

4.7

-7|-

1.7

-6|-

2.1

-6|5

.9-7

31.9|2

2.4|2

0.7|2

0.2|3

2.4

bqp250-8

251|2

51

3942|3

241|3

047|2

944|6

201

9.8

-7|9

.9-7|9

.9-7|9

.9-7|9

.1-7

-5.2

-7|-

9.4

-7|-

7.9

-7|-

8.8

-7|-

4.3

-719.4|1

5.8|1

6.2|1

4.4|3

0.2

bqp250-9

251|2

51

7165|5

012|4

383|4

350|7

601

9.9

-7|9

.9-7|9

.9-7|9

.8-7|9

.5-7

-6.3

-7|-

5.7

-7|-

6.3

-7|7

.8-8|4

.1-7

33.6|2

4|2

0.7|2

1.5|3

5

bqp250-1

0251|2

51

5014|3

660|2

619|3

128|7

001

9.9

-7|9

.6-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-6|-

3.6

-7|-

1.6

-6|-

9.8

-7|-

3.7

-723.4|1

8.6|1

3.9|1

5.1|3

2.3

bqp500-1

501|5

01

10132|7

274|6

866|6

662|7

301

9.9

-7|9

.8-7|9

.8-7|9

.9-7|9

.9-7

2.9

-6|3

.6-7|-

1.8

-7|-

4.1

-7|5

.9-7

4:2

0|3

:16|4

:01|2

:52|3

:07

bqp500-2

501|5

01

15240|8

644|7

766|7

958|7

801

9.9

-7|9

.9-7|9

.6-7|9

.9-7|8

.5-7

-2.9

-6|3

.8-7|9

.1-7|4

.8-7|-

1.1

-66:2

5|3

:40|3

:43|3

:23|3

:19

bqp500-3

501|5

01

12690|9

531|8

091|8

112|7

983

9.9

-7|9

.9-7|9

.9-7|9

.8-7|9

.9-7

-3.2

-6|-

9.5

-7|-

1.1

-6|-

3.3

-7|1

.5-7

5:3

1|4

:07|3

:36|3

:39|3

:33

bqp500-4

501|5

01

14051|8

963|8

024|7

716|7

323

9.9

-7|9

.8-7|9

.9-7|9

.9-7|9

.9-7

3.6

-6|-

5.3

-8|3

.4-7|9

.5-8|4

.0-7

6:1

2|4

:00|3

:39|3

:23|3

:12

43

Page 44: arXiv:1803.10803v2 [math.OC] 28 Jan 2019March 28, 2018; Revised on Jan 28, 2019 Abstract In this paper, we show that for a class of linearly constrained convex composite optimization

Table

3(c

onti

nued)

itera

tion

ηSD

Pηgap

CP

Uti

me

pro

ble

mm|n

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

bqp500-5

501|5

01

12010|8

797|7

432|7

264|7

601

9.9

-7|9

.9-7|9

.9-7|9

.8-7|8

.1-7

7.0

-6|-

3.1

-7|1

.3-7|-

3.4

-7|-

7.6

-75:1

3|3

:57|3

:19|3

:17|3

:19

bqp500-6

501|5

01

10177|7

333|7

396|6

973|7

353

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.2

-6|-

7.1

-7|-

1.2

-6|-

4.5

-7|-

3.2

-74:2

8|3

:20|3

:24|3

:06|3

:20

bqp500-7

501|5

01

10795|7

128|6

915|6

915|6

961

9.9

-7|9

.6-7|9

.8-7|9

.9-7|9

.9-7

3.7

-7|1

.6-6|-

1.0

-6|-

8.1

-7|-

8.9

-74:4

8|3

:15|3

:12|3

:05|3

:06

bqp500-8

501|5

01

9848|7

942|7

420|7

241|7

195

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

1.4

-6|5

.3-7|-

1.7

-7|-

2.8

-7|-

7.8

-74:2

6|3

:34|3

:35|3

:16|3

:14

bqp500-9

501|5

01

10779|7

312|6

950|7

121|7

357

9.9

-7|9

.8-7|9

.9-7|9

.9-7|9

.9-7

5.5

-7|-

5.1

-7|-

4.6

-7|-

3.9

-7|4

.0-7

4:5

2|3

:26|3

:05|3

:13|3

:19

bqp500-1

0501|5

01

17191|9

623|8

354|8

227|7

801

9.7

-7|9

.9-7|9

.9-7|9

.9-7|9

.7-7

-1.6

-6|-

8.5

-7|-

4.9

-7|8

.3-7|1

.7-6

7:4

6|4

:22|3

:36|3

:45|3

:29

gka1a

51|5

13485|2

877|2

632|2

594|3

679

8.7

-7|8

.7-7|9

.4-7|7

.7-7|9

.9-7

-1.5

-6|2

.4-6|2

.7-6|2

.4-6|2

.0-7

1.8|1

.5|1

.5|1

.4|1

.8

gka2a

61|6

13690|2

789|2

572|2

514|4

201

9.8

-7|9

.9-7|9

.7-7|9

.5-7|9

.6-7

2.8

-6|8

.6-7|1

.2-6|7

.1-7|5

.6-7

2.2|1

.6|1

.7|1

.6|2

.4

gka3a

71|7

11569|1

323|1

101|1

083|4

901

9.0

-7|8

.0-7|9

.9-7|7

.2-7|9

.9-7

1.2

-6|1

.4-6|2

.9-6|1

.6-6|-

1.3

-71.3|1

.1|0

.9|0

.8|3

.4

gka4a

81|8

12573|2

215|1

942|1

866|5

501

9.6

-7|9

.1-7|9

.6-7|8

.6-7|9

.5-7

9.6

-7|-

1.6

-6|2

.5-6|1

.7-6|4

.6-8

2.2|2|1

.9|1

.6|4

.3

gka5a

51|5

11857|1

334|1

272|1

Table 3 (continued). For each test problem, the table reports, under the step-lengths τ = 1, 1.618, 1.90, 1.99, and 1.999, the number of iterations, the accuracy measures ηSDP and ηgap, and the CPU time. Problems on this page: gka6a–gka8a, gka1b–gka10b, gka1c–gka7c, gka1d–gka10d, and gka1e–gka2e.


Table 3 (continued). Each row lists a problem with its dimensions m|n and reports, under the step-lengths τ = 1, 1.618, 1.90, 1.99, and 1.999, the number of iterations, the accuracy measures ηSDP and ηgap, and the CPU time. Problems on this page: gka3e–gka5e, gka1f–gka5f, soybean-small-2–11, soybean-large-2–11, spambase-small-2–11, and spambase-medium-2–6.


Table 3 (continued). Each row lists a problem with its dimensions m|n and reports, under the step-lengths τ = 1, 1.618, 1.90, 1.99, and 1.999, the number of iterations, the accuracy measures ηSDP and ηgap, and the CPU time. Problems on this page: spambase-medium-7–11, spambase-large-2–11, abalone-small-2–11, abalone-medium-2–11, and abalone-large-2–9.


Table 3 (continued). Each row lists a problem with its dimensions m|n and reports, under the step-lengths τ = 1, 1.618, 1.90, 1.99, and 1.999, the number of iterations, the accuracy measures ηSDP and ηgap, and the CPU time. Problems on this page: abalone-large-10–11, segment-small-2–11, segment-medium-2–11, segment-large-2–11, housing-2–11, and nonsym(5,4).


Table

3(c

onti

nued)

itera

tion

ηSD

Pηgap

CP

Uti

me

pro

ble

mm|n

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

τ=

1|1.6

18|1.9

0|1.9

9|1.9

99

('v ×5' abbreviates the value repeated in all five τ columns)

nonsym(6,4)   9260|216      iter: 3684|3801|3746|3769|3765    ηSDP: 9.9-7|9.8-7|9.9-7|9.4-7|9.4-7    ηgap: 1.8-5|-7.1-6|1.7-5|1.6-5|1.6-5    time: 14.4|14.9|14.8|15.3|15
nonsym(7,4)   21951|343     iter: 5059|5180|5404|5324|5324    ηSDP: 9.8-7|9.9-7|9.4-7|9.9-7|9.8-7    ηgap: -8.1-6|-9.0-6|2.3-5|-2.3-5|-8.0-6    time: 47.4|47.8|49.8|49.6|49.3
nonsym(8,4)   46655|512     iter: 8210|7768|7734|7709|7705    ηSDP: 9.8-7|9.8-7|9.9-7|9.9-7|9.8-7    ηgap: -1.0-5|-2.9-5|2.9-5|-2.9-5|-9.8-6    time: 4:28|4:17|4:11|4:48|4:44
nonsym(9,4)   91124|729     iter: 10876|9980|10751|10728|10704    ηSDP: 9.9-7|9.9-7|9.7-7|9.8-7|9.9-7    ηgap: -2.7-5|9.1-6|-8.0-6|-2.6-5|-2.6-5    time: 13:56|13:34|14:40|14:20|14:34
nonsym(10,4)  166374|1000   iter: 30089|33786|33829|33820|33760    ηSDP: 9.9-7 ×5    ηgap: -1.1-5|-3.5-5|-3.5-5|3.4-5|-1.1-5    time: 1:45:44|1:52:48|1:53:39|1:51:51|1:52:39
nonsym(11,4)  287495|1331   iter: 17778|17487|17408|17733|17693    ηSDP: 9.9-7|9.9-7|9.9-7|9.7-7|9.8-7    ηgap: -4.5-5|-1.5-5|-1.5-5|4.5-5|4.5-5    time: 1:47:44|1:40:05|1:43:57|1:45:23|2:10:42
nonsym(3,5)   1295|81       iter: 2517|2461|2447|2450|2449    ηSDP: 8.7-7|9.2-7|9.7-7|8.9-7|9.6-7    ηgap: -7.9-6|-7.3-6|-7.4-6|7.8-6|7.7-6    time: 1:30|1:51|1:15|1:20|1:44
nonsym(4,5)   9999|256      iter: 12494|12149|12124|12082|12087    ηSDP: 9.7-7|9.7-7|9.9-7|9.9-7|9.8-7    ηgap: -5.9-6|1.6-5|-1.7-5|1.7-5|5.5-6    time: 5:56|5:55|6:59|7:10|6:15
nonsym(5,5)   50624|625     iter: 8373|7586|7512|7148|6761    ηSDP: 9.9-7|9.9-7|9.8-7|9.9-7|9.9-7    ηgap: 1.1-5|3.1-5|3.0-5|3.0-5|-1.1-5    time: 12:48|11:42|12:06|11:40|11:51
nonsym(6,5)   194480|1296   iter: 16298|15117|14606|14656|14735    ηSDP: 9.9-7 ×5    ηgap: 1.5-5|-1.5-5|-4.5-5|-4.4-5|1.7-5    time: 1:55:33|1:38:58|1:47:48|1:33:36|1:31:43
sym-rd(3,20)  10625|231     iter: 2117|1748|1633|1646|1645    ηSDP: 8.1-7|9.4-7|9.5-7|9.8-7|9.5-7    ηgap: 4.8-7|1.3-5|1.0-5|4.5-6|4.0-6    time: 16:57|13:37|12:42|12:55|12:49
sym-rd(5,5)   461|56        iter: 262|251|253|250|249    ηSDP: 9.9-7|9.7-7|3.5-7|4.2-7|4.7-7    ηgap: -7.6-6|2.6-5|2.7-6|7.0-6|4.1-6    time: 1.1|0.9|0.3|0.7|0.9
sym-rd(5,10)  8007|286      iter: 697|761|712|773|773    ηSDP: 7.2-7|8.8-7|8.4-7|7.1-7|9.0-7    ηgap: 2.5-5|6.4-6|3.7-5|3.7-5|4.5-5    time: 7:44|9:01|6:41|8:56|8:06
sym-rd(6,5)   209|35        iter: 157|171|161|160|160    ηSDP: 8.9-7|4.2-7|6.2-7|7.7-7|9.3-7    ηgap: 1.1-5|7.9-6|2.2-6|1.6-5|1.8-5    time: 0.6|0.3|0.1|0.2|0.2
sym-rd(6,10)  5004|220      iter: 635|588|580|569|568    ηSDP: 9.6-7|8.6-7|7.4-7|6.8-7|6.9-7    ηgap: 4.5-5|4.5-5|4.4-5|3.3-5|3.4-5    time: 2:43|2:18|2:01|2:14|2:14



Table 4: The performance of 7 multi-block ADMM-type algorithms on solving convex quadratic SDP problems (accuracy = 10−6). The maximum iteration number is set as 500,000. In the table, 'D-E' means the directly extended multi-block ADMM with step-length τ = 1; P1.0 (P1.6, P1.9) means the block sGS decomposition based multi-block proximal ADMM with step-length τ = 1 (τ = 1.618, τ = 1.9, etc.); and iP1.0 (iP1.6, iP1.9, etc.) means the block sGS decomposition based inexact multi-block ADMM with step-length τ = 1 (τ = 1.618, τ = 1.9, etc.). The time of computation is in the format "hours:minutes:seconds" if it is larger than one minute, or else it is recorded in seconds.
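The caption's timing convention (plain seconds below one minute, clock format above) can be captured in a small helper. This is a minimal sketch of our reading of that convention, not code from the paper; the function name `format_time` is ours.

```python
def format_time(seconds: float) -> str:
    # Render a running time the way the table's caption describes:
    # plain seconds when under one minute, m:ss above that, and
    # h:mm:ss once an hour is exceeded (an assumed reading of the
    # caption's "hours:minutes:seconds" convention).
    if seconds < 60:
        return f"{seconds:g}"
    total = int(round(seconds))
    hours, rem = divmod(total, 3600)
    minutes, secs = divmod(rem, 60)
    if hours:
        return f"{hours}:{minutes:02d}:{secs:02d}"
    return f"{minutes}:{secs:02d}"

print(format_time(33.5))   # -> 33.5
print(format_time(71))     # -> 1:11
print(format_time(38240))  # -> 10:37:20
```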

problem | m;nE;nI | iteration | ηSDP | ηgap | CPU time
(each of the last four column groups lists, in order, D-E | P1.0 | P1.6 | P1.9 | iP1.0 | iP1.6 | iP1.9)

('v ×7' abbreviates the value repeated in all seven algorithm columns)

be100.1    101;101;14850   iter: 26599|25983|20453|19877|6585|5190|4912   ηSDP: 9.9-7 ×7   ηgap: 1.7-7|1.8-7|8.6-8|4.8-8|-1.2-6|-8.7-7|-1.2-6   time: 1:11|1:19|1:01|1:00|33.5|22.8|22
be100.2    101;101;14850   iter: 14616|14399|11747|11562|5292|4442|4287   ηSDP: 9.9-7 ×7   ηgap: 4.4-7|3.5-7|3.4-7|3.6-8|-9.7-7|-4.8-7|-7.0-7   time: 43.7|44.1|35.4|34.9|22|19.1|18.4
be100.3    101;101;14850   iter: 17930|17453|13747|11606|4230|3520|3414   ηSDP: 9.9-7 ×7   ηgap: -1.6-7|-1.5-7|-1.5-7|-1.4-7|-4.4-7|-3.2-7|-2.4-7   time: 52.3|50.5|40.6|34.1|18.5|15.5|14.8
be100.4    101;101;14850   iter: 28987|30096|23233|21379|6856|5214|4916   ηSDP: 9.9-7 ×7   ηgap: 1.8-7|1.8-7|1.6-7|1.6-7|-3.6-7|-8.8-7|-5.0-7   time: 1:24|1:30|1:08|1:02|31.1|22.6|22.2
be100.5    101;101;14850   iter: 13962|15264|10192|10004|4593|3686|3612   ηSDP: 9.9-7 ×7   ηgap: 8.8-7|1.0-6|1.0-6|9.3-7|-4.2-7|7.1-8|-3.9-7   time: 41.7|46.3|30.9|30.8|19.2|16.2|15.9
be100.6    101;101;14850   iter: 18215|18685|14235|13690|5209|4049|4287   ηSDP: 9.9-7 ×7   ηgap: 6.5-7|6.5-7|6.4-7|5.7-7|-1.1-6|-1.2-6|-6.9-7   time: 53.1|54.3|42.1|40.2|22.4|18.1|19.2
be100.7    101;101;14850   iter: 20092|20280|15472|14825|5281|4113|3952   ηSDP: 9.9-7 ×7   ηgap: 2.8-7|3.1-7|2.1-7|2.4-7|-9.2-7|-5.1-7|-6.1-7   time: 59.6|59.5|45.9|43.4|23.1|18.3|17.7
be100.8    101;101;14850   iter: 21370|20910|16458|15425|7231|5203|4617   ηSDP: 9.9-7 ×7   ηgap: 3.8-7|3.9-7|4.1-7|2.9-7|-2.5-7|-6.3-7|-6.1-7   time: 1:03|1:00|47.2|44.8|32.1|23.5|21
be100.9    101;101;14850   iter: 18780|20839|15095|13046|5920|4366|4782   ηSDP: 9.9-7 ×7   ηgap: 1.7-7|1.8-7|1.8-7|1.7-7|-3.1-7|-3.8-7|-3.1-7   time: 56.4|1:04|45.9|40.2|25.6|19.7|21.6
be100.10   101;101;14850   iter: 13409|12708|10008|9959|4480|3685|3170   ηSDP: 9.9-7 ×7   ηgap: 3.1-7|2.5-7|3.4-7|3.7-7|-2.5-7|-2.1-7|5.2-7   time: 39.9|38.6|30.9|30.7|20.7|19.4|16.4
be120.3.1  121;121;21420   iter: 36536|33139|27803|25503|10934|7218|7601   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.8-7   ηgap: 4.4-7|5.2-7|4.3-7|4.1-7|-6.7-7|-8.5-7|-9.9-7   time: 2:22|2:16|1:53|1:44|1:06|44.5|47.5
be120.3.2  121;121;21420   iter: 22024|20823|17065|14387|6362|5239|5933   ηSDP: 9.8-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 6.5-7|6.2-7|2.7-7|6.9-7|-1.6-7|-1.8-7|-1.7-7   time: 1:28|1:28|1:12|59.8|35.5|29.5|34
be120.3.3  121;121;21420   iter: 30816|30566|22729|21785|6505|5025|4417   ηSDP: 9.9-7 ×7   ηgap: 1.1-8|8.2-9|-1.1-9|3.1-9|-8.1-7|-8.4-7|-7.9-7   time: 2:10|2:14|1:39|1:35|37.4|29|25.9
be120.3.4  121;121;21420   iter: 32611|26705|21582|23609|8907|7260|7300   ηSDP: 9.9-7|9.7-7|9.8-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 3.0-8|2.0-7|2.4-7|5.8-8|-6.9-7|-5.4-7|-8.7-8   time: 2:09|1:53|1:30|1:38|54.6|45.3|45.3
be120.3.5  121;121;21420   iter: 21167|21348|17614|16896|8537|5728|7127   ηSDP: 9.4-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 9.1-8|6.8-8|1.9-8|-2.0-9|-1.9-7|-4.8-7|-1.7-7   time: 1:25|1:28|1:13|1:11|51.1|33.5|42.5
be120.3.6  121;121;21420   iter: 33348|33295|26053|23919|8513|6072|5620   ηSDP: 9.9-7 ×7   ηgap: -1.0-7|-9.8-8|-9.8-8|-9.1-8|-1.2-7|-5.1-7|-1.2-6   time: 2:12|2:19|1:49|1:41|54.1|39|35.9
be120.3.7  121;121;21420   iter: 42467|42294|32274|29722|8952|7096|7464   ηSDP: 9.9-7 ×7   ηgap: 9.2-8|9.9-8|9.6-8|8.9-8|-6.7-7|-5.3-9|-5.0-7   time: 2:51|2:57|2:13|2:04|49.8|43|44.7
be120.3.8  121;121;21420   iter: 18853|18306|15051|16448|7598|5444|5999   ηSDP: 9.9-7 ×7   ηgap: 8.5-7|1.0-6|5.4-7|3.7-8|-5.2-7|-9.3-7|-3.3-7   time: 1:16|1:18|1:05|1:09|43.5|30.6|33.9
be120.3.9  121;121;21420   iter: 17200|17015|13458|12528|5690|4125|3967   ηSDP: 9.9-7 ×7   ηgap: 1.1-6|1.2-6|8.8-7|1.2-6|-3.8-7|-3.6-7|-3.8-7   time: 1:09|1:09|55.2|51|31.9|24.5|23.6
be120.3.10 121;121;21420   iter: 20854|20129|16858|15738|11939|7416|8737   ηSDP: 9.9-7 ×7   ηgap: 1.4-6|1.3-6|1.5-6|1.4-6|-5.2-7|-6.7-7|-5.8-7   time: 1:24|1:24|1:11|1:06|1:08|42.4|49.8
be120.8.1  121;121;21420   iter: 14493|15499|11470|10034|4996|4104|3919   ηSDP: 9.9-7 ×7   ηgap: 9.2-7|9.8-7|9.7-7|9.9-7|-2.6-7|-5.6-8|-9.4-8   time: 58|1:03|46.8|41.5|29.1|24.2|23.7
be120.8.2  121;121;21420   iter: 24822|24477|20548|18052|9010|6647|6553   ηSDP: 9.9-7 ×7   ηgap: 3.9-7|3.9-7|3.2-7|2.7-7|-4.0-7|-3.6-7|-3.8-7   time: 1:42|1:44|1:29|1:17|53.7|40.5|39.4
be120.8.3  121;121;21420   iter: 16108|17178|13140|11817|5080|4289|4401   ηSDP: 9.9-7 ×7   ηgap: 5.0-7|7.2-7|3.2-7|5.5-7|-1.2-6|-3.7-7|-5.5-7   time: 1:05|1:12|56.1|50.4|32.8|29.5|30.1
be120.8.4  121;121;21420   iter: 23850|23821|18243|16852|8511|7273|5807   ηSDP: 9.9-7 ×7   ηgap: 1.0-6|1.0-6|9.9-7|9.9-7|-6.7-7|-4.4-7|-1.0-6   time: 1:38|1:41|1:19|1:11|48|42.8|33.6
be120.8.5  121;121;21420   iter: 17486|18030|14479|12872|5917|4206|4473   ηSDP: 9.9-7 ×7   ηgap: 4.7-7|4.6-7|3.7-7|3.5-7|-3.6-7|-4.5-7|-2.3-7   time: 1:12|1:17|1:01|54.2|32.7|23.7|25.7
be120.8.6  121;121;21420   iter: 16195|15975|12294|11529|5417|4386|4017   ηSDP: 9.9-7 ×7   ηgap: 1.2-6|1.2-6|8.9-7|1.2-6|-8.5-7|-3.6-7|-1.8-7   time: 1:07|1:07|51.9|48.1|31.3|26|23.7
be120.8.7  121;121;21420   iter: 22803|24455|19984|17313|6351|5041|5300   ηSDP: 9.9-7 ×7   ηgap: 7.0-8|7.5-8|5.2-8|4.3-8|-4.2-7|-2.9-7|-5.9-7   time: 1:29|1:41|1:23|1:12|35.2|27.7|29.3
be120.8.8  121;121;21420   iter: 18304|19368|15242|12962|5493|4389|4208   ηSDP: 9.9-7 ×7   ηgap: 8.3-8|5.8-8|8.2-8|1.2-7|-4.5-7|-3.2-7|-4.1-7   time: 1:14|1:19|1:05|55.9|32.2|26.6|25.4
be120.8.9  121;121;21420   iter: 17717|17698|14975|13722|4543|3928|4201   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|8.7-7   ηgap: 2.2-8|8.0-8|-6.7-8|-4.2-8|7.9-7|5.0-7|3.9-7   time: 1:15|1:17|1:05|59.3|25.7|24|25.3
be120.8.10 121;121;21420   iter: 14772|16218|12679|10876|6119|4223|4365   ηSDP: 9.9-7 ×7   ηgap: 1.8-6|1.9-6|9.1-7|1.7-6|-2.3-7|-2.7-7|2.7-7   time: 1:01|1:10|54.3|46.5|34.8|25.1|26
be150.3.1  151;151;33525   iter: 28554|29936|23256|22393|8513|6313|6043   ηSDP: 9.6-7|9.9-7|9.1-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 4.0-7|4.7-7|3.9-7|3.9-7|-5.6-7|-1.2-6|-8.9-7   time: 2:54|3:14|2:34|2:28|1:22|1:05|1:00
be150.3.2  151;151;33525   iter: 20017|20358|15903|14062|6680|5961|5577   ηSDP: 9.9-7 ×7   ηgap: 1.9-6|1.9-6|1.8-6|1.6-6|-4.1-7|-4.4-7|-1.6-7   time: 1:58|2:07|1:40|1:29|1:02|55.4|51.7
be150.3.3  151;151;33525   iter: 17823|17904|13439|12084|4531|3913|4286   ηSDP: 9.9-7 ×7   ηgap: 1.6-6|1.6-6|1.6-6|1.5-6|-1.0-6|9.5-8|-1.2-7   time: 1:41|1:49|1:23|1:13|39.6|34.5|37.8
be150.3.4  151;151;33525   iter: 42544|44008|33882|31582|12465|9109|10139   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.8-7|9.9-7   ηgap: 1.0-7|1.0-7|1.2-7|1.1-7|-6.5-7|-9.3-7|-3.8-7   time: 3:51|4:18|3:20|3:06|2:24|1:43|1:55
be150.3.5  151;151;33525   iter: 29602|25107|21383|19189|7754|5014|5531   ηSDP: 9.9-7 ×7   ηgap: 8.1-8|5.8-7|3.1-7|3.9-7|-2.0-7|-2.8-7|-3.1-7   time: 2:54|2:38|2:17|2:04|1:18|53.5|58.6
be150.3.6  151;151;33525   iter: 23547|23575|19302|17081|5190|4414|4375   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.8-7   ηgap: 1.3-7|1.8-7|7.3-8|7.2-8|4.3-9|8.1-8|2.6-7   time: 2:17|2:24|1:58|1:44|43.4|39|40.6
be150.3.7  151;151;33525   iter: 34355|38137|25627|22630|13035|10054|9920   ηSDP: 9.5-7|9.7-7|9.8-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 4.9-7|4.3-7|4.9-7|5.3-7|-1.3-6|-1.0-6|-1.3-6   time: 3:29|4:07|2:42|2:24|2:06|1:37|1:36
be150.3.8  151;151;33525   iter: 28913|29932|22314|21019|7357|6038|5815   ηSDP: 9.9-7 ×7   ηgap: 2.6-7|2.5-7|2.7-7|3.4-7|-2.4-7|5.2-8|3.7-8   time: 2:51|3:06|2:22|2:14|1:08|56.9|54.2
be150.3.9  151;151;33525   iter: 16832|17394|13964|12810|5428|4803|4210   ηSDP: 9.9-7 ×7   ηgap: 8.3-7|8.1-7|6.3-7|6.7-7|-3.8-7|-3.1-7|-3.0-8   time: 1:32|1:41|1:22|1:13|45.9|43|38.6
be150.3.10 151;151;33525   iter: 30567|33360|20147|22104|10144|7516|7208   ηSDP: 9.9-7|9.0-7|8.9-7|9.8-7|9.9-7|9.9-7|9.9-7   ηgap: 3.9-7|3.1-7|8.2-7|4.8-7|-5.2-7|-6.3-7|-5.4-7   time: 3:01|3:32|2:08|2:21|1:37|1:13|1:08
be150.8.1  151;151;33525   iter: 22101|21781|17326|16353|7068|5587|5528   ηSDP: 9.9-7 ×7   ηgap: 6.6-7|7.3-7|4.1-7|5.2-7|-2.6-8|-7.5-8|-1.1-7   time: 2:06|2:13|1:46|1:41|1:06|53|53.8
be150.8.2  151;151;33525   iter: 18626|18037|14057|13687|6735|5412|4746   ηSDP: 9.9-7 ×7   ηgap: 5.3-7|5.8-7|4.6-7|3.1-7|6.5-8|2.9-8|-2.7-7   time: 1:46|1:50|1:28|1:25|1:02|50.6|47.8
be150.8.3  151;151;33525   iter: 36848|37270|28677|27414|9596|7847|7371   ηSDP: 9.9-7 ×7   ηgap: 2.1-7|2.1-7|1.8-7|1.7-7|-7.3-7|-9.8-7|-5.7-7   time: 3:47|4:05|3:10|3:01|1:34|1:17|1:12
be150.8.4  151;151;33525   iter: 34316|37312|25461|25550|10437|7564|6639   ηSDP: 9.9-7 ×7   ηgap: 2.9-7|2.9-7|3.0-7|2.9-7|-2.1-7|-2.8-7|-2.1-7   time: 3:27|4:01|2:46|2:44|1:41|1:13|1:08
be150.8.5  151;151;33525   iter: 28012|28619|22723|21236|5490|4792|4605   ηSDP: 9.9-7|9.9-7|9.8-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 3.9-7|4.3-7|3.2-7|3.5-7|-3.0-7|-5.0-7|1.0-6   time: 2:49|3:02|2:27|2:15|49.8|43.4|43.5



Table 4 (continued)

problem | m;nE;nI | iteration | ηSDP | ηgap | CPU time
(each of the last four column groups lists, in order, D-E | P1.0 | P1.6 | P1.9 | iP1.0 | iP1.6 | iP1.9)

('v ×7' abbreviates the value repeated in all seven algorithm columns)

be150.8.6  151;151;33525   iter: 24264|25728|19256|17622|6491|4839|4795   ηSDP: 9.9-7 ×7   ηgap: 6.4-7|6.2-7|5.7-7|5.6-7|-1.4-6|-1.4-6|-5.8-7   time: 2:22|2:40|2:02|1:50|1:08|51.8|52.3
be150.8.7  151;151;33525   iter: 31040|31624|24243|22476|7344|5761|5421   ηSDP: 9.9-7 ×7   ηgap: 1.4-7|3.6-7|3.4-7|3.2-7|-3.6-8|-1.4-7|-3.0-7   time: 3:12|3:28|2:40|2:27|1:11|55.1|52.6
be150.8.8  151;151;33525   iter: 27941|27247|20450|18710|9339|5521|6209   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.4-7   ηgap: 1.2-7|1.7-7|9.8-8|5.1-8|-3.6-7|-1.1-6|-6.9-7   time: 2:51|2:56|2:13|2:04|1:33|53.7|1:04
be150.8.9  151;151;33525   iter: 27997|28052|19827|18572|8516|7629|6622   ηSDP: 9.9-7|9.9-7|9.8-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 1.3-7|1.5-7|4.1-7|3.7-7|-3.5-7|-2.8-7|-3.2-7   time: 2:47|2:59|2:05|1:59|1:21|1:13|1:05
be150.8.10 151;151;33525   iter: 23508|24124|19173|18945|6724|5294|5205   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.3-7|9.9-7|9.9-7   ηgap: 3.3-7|3.9-7|2.0-7|1.7-7|-9.5-7|-4.3-7|-5.4-7   time: 2:22|2:34|2:04|2:01|1:05|53.1|52.5
be200.3.1  201;201;59700   iter: 29477|28767|22306|20318|6960|5912|6226   ηSDP: 9.9-7 ×7   ηgap: 1.6-7|1.9-7|1.7-7|1.1-7|-3.9-7|-7.2-7|8.9-7   time: 4:23|4:41|3:42|3:24|1:50|1:35|1:47
be200.3.2  201;201;59700   iter: 25578|25807|17126|15520|6696|6063|5571   ηSDP: 9.9-7 ×7   ηgap: 9.2-7|9.2-7|1.4-6|1.5-6|-6.6-7|-7.5-8|1.0-8   time: 3:47|4:14|2:44|2:26|1:43|1:36|1:29
be200.3.3  201;201;59700   iter: 30087|30635|24874|23386|9016|7579|6902   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.8-7   ηgap: 7.7-7|7.8-7|4.6-7|4.1-7|-1.1-6|-9.3-7|-1.0-6   time: 4:35|5:06|4:13|3:59|2:22|2:02|1:51
be200.3.4  201;201;59700   iter: 46338|46575|37079|34330|9908|6601|6075   ηSDP: 9.9-7 ×7   ηgap: 3.2-9|3.1-8|-2.9-9|-4.6-9|-9.9-7|-5.6-7|-7.4-7   time: 7:29|8:04|6:26|5:56|2:58|2:02|1:52
be200.3.5  201;201;59700   iter: 32249|34880|27644|23057|7131|5928|5954   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.7-7   ηgap: 3.5-7|2.9-7|2.3-7|2.7-7|-5.4-7|-3.6-7|-2.3-7   time: 5:01|5:57|4:43|3:51|1:49|1:36|1:37
be200.3.6  201;201;59700   iter: 29111|28738|23565|21866|6999|4948|5292   ηSDP: 9.9-7|9.9-7|9.9-7|9.8-7|9.9-7|9.9-7|9.9-7   ηgap: 3.7-7|5.6-7|2.1-7|1.9-7|-1.0-6|-5.7-7|8.7-8   time: 4:26|4:41|3:54|3:40|1:44|1:17|1:24
be200.3.7  201;201;59700   iter: 35410|35805|28035|25610|14332|11053|9554   ηSDP: 9.9-7 ×7   ηgap: 5.8-7|5.5-7|4.5-7|4.9-7|-1.4-7|-1.9-7|-1.8-7   time: 5:27|5:57|4:49|4:24|3:52|3:16|2:38
be200.3.8  201;201;59700   iter: 43650|45249|37113|34005|10102|7651|7160   ηSDP: 9.9-7 ×7   ηgap: 1.1-7|1.1-7|7.1-8|6.6-8|-9.8-7|-7.2-7|-8.7-7   time: 6:55|7:54|6:22|5:49|2:44|2:02|1:57
be200.3.9  201;201;59700   iter: 42442|43126|33676|30112|8642|7773|6776   ηSDP: 9.9-7 ×7   ηgap: 7.3-8|9.3-8|5.2-8|5.4-8|-2.3-7|-6.4-7|-2.5-7   time: 6:34|7:17|5:45|5:06|2:34|2:21|2:07
be200.3.10 201;201;59700   iter: 21971|22213|21379|18907|6456|4963|5803   ηSDP: 9.9-7 ×7   ηgap: 5.6-7|6.2-7|5.9-8|9.2-8|-1.1-6|-3.5-8|-1.7-8   time: 3:16|3:35|3:31|3:09|1:41|1:27|1:45
be200.8.1  201;201;59700   iter: 41652|42441|31324|30128|13327|9510|8566   ηSDP: 9.9-7 ×7   ηgap: 2.1-7|2.7-7|1.5-7|1.4-7|-3.2-7|-5.4-7|-3.9-7   time: 6:42|7:27|5:30|5:17|3:42|2:37|2:22
be200.8.2  201;201;59700   iter: 26502|26103|20051|18601|7538|6387|5412   ηSDP: 9.9-7 ×7   ηgap: 3.2-7|4.8-7|3.8-7|4.4-7|-2.3-8|-6.3-8|3.1-7   time: 3:58|4:12|3:17|3:02|1:56|1:40|1:29
be200.8.3  201;201;59700   iter: 37333|37889|28210|26797|11003|8411|7938   ηSDP: 9.9-7|9.9-7|9.8-7|9.8-7|9.9-7|9.9-7|9.9-7   ηgap: 2.3-7|2.4-7|2.6-7|2.2-7|-4.2-7|-6.7-7|-4.6-7   time: 5:45|6:21|4:43|4:28|3:03|2:27|2:21
be200.8.4  201;201;59700   iter: 32888|32062|25654|24081|7527|7312|6056   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.8-7|9.9-7|9.1-7   ηgap: 2.0-7|2.7-7|1.7-7|1.8-7|-5.0-7|-4.0-7|-7.0-7   time: 5:04|5:19|4:19|4:04|2:01|1:59|1:43
be200.8.5  201;201;59700   iter: 38925|38943|28609|26971|9550|7412|5227   ηSDP: 9.9-7 ×7   ηgap: 5.6-8|7.9-8|1.0-7|9.3-8|-4.7-7|-4.8-7|-6.1-7   time: 6:10|6:39|4:57|4:27|2:31|1:57|1:25
be200.8.6  201;201;59700   iter: 31948|32339|23412|22468|14473|9133|9961   ηSDP: 9.9-7 ×7   ηgap: 1.6-6|1.7-6|1.5-6|1.5-6|-3.0-7|-7.1-7|-3.0-7   time: 4:44|5:15|3:49|3:37|3:56|2:28|2:44
be200.8.7  201;201;59700   iter: 37078|36472|29123|25885|8613|7075|6355   ηSDP: 9.9-7 ×7   ηgap: 4.8-8|9.3-8|4.4-8|4.2-8|-5.9-7|5.5-9|-3.7-7   time: 5:44|6:08|5:01|4:27|2:20|1:55|1:47
be200.8.8  201;201;59700   iter: 35815|37470|26962|24955|14921|12354|11543   ηSDP: 9.9-7 ×7   ηgap: 3.0-7|3.1-7|3.3-7|2.0-7|-4.6-7|-3.1-7|-4.0-7   time: 5:24|6:07|4:22|4:00|4:27|3:50|3:36
be200.8.9  201;201;59700   iter: 24808|25601|19909|18903|8739|6843|5944   ηSDP: 9.9-7 ×7   ηgap: 1.2-6|1.3-6|7.6-7|6.8-7|-2.5-7|9.5-8|-2.3-7   time: 3:35|4:00|3:11|3:03|2:16|1:50|1:39
be200.8.10 201;201;59700   iter: 25044|26022|19027|17467|7755|5432|5219   ηSDP: 9.9-7 ×7   ηgap: 1.7-6|1.7-6|1.5-6|1.5-6|-4.0-7|-5.8-7|6.5-8   time: 3:33|3:57|2:57|2:40|1:57|1:24|1:22
be250.1    251;251;93375   iter: 60594|59445|47902|44538|17487|13380|12119   ηSDP: 9.9-7 ×7   ηgap: 2.3-7|2.3-7|2.1-7|2.2-7|-4.9-7|-5.4-7|-5.6-7   time: 13:56|15:14|12:22|11:29|7:38|5:53|5:21
be250.2    251;251;93375   iter: 52532|56803|41194|38447|12725|9050|9634   ηSDP: 9.9-7 ×7   ηgap: 6.1-8|5.0-8|5.1-8|4.2-8|-5.2-7|-3.9-7|-4.6-7   time: 12:14|14:30|10:39|9:53|5:28|3:58|4:17
be250.3    251;251;93375   iter: 48310|49846|40110|36528|22337|18339|16113   ηSDP: 9.9-7 ×7   ηgap: 1.2-6|1.6-6|1.1-6|1.3-6|-4.6-7|-3.3-7|-5.5-7   time: 10:55|12:34|10:05|9:10|9:45|8:07|7:09
be250.4    251;251;93375   iter: 55860|57006|43507|41110|18580|15409|14453   ηSDP: 9.9-7 ×7   ηgap: 3.5-7|3.8-7|3.2-7|3.4-7|-3.4-7|-3.5-7|-3.4-7   time: 13:05|14:42|11:17|10:42|8:18|7:00|6:40
be250.5    251;251;93375   iter: 32841|33106|28314|26201|13476|8600|10101   ηSDP: 9.9-7 ×7   ηgap: 7.8-7|1.0-6|4.2-7|3.3-7|-2.7-7|-2.8-7|-2.5-7   time: 7:08|7:56|7:00|6:29|5:31|3:48|4:35
be250.6    251;251;93375   iter: 58449|59628|42132|41389|17968|13180|12357   ηSDP: 9.9-7 ×7   ηgap: 9.3-8|8.9-8|1.0-7|8.2-8|-2.1-7|-2.0-7|-2.4-7   time: 13:29|15:18|10:48|10:37|7:56|5:57|5:41
be250.7    251;251;93375   iter: 38639|37397|30589|27266|15412|12418|10112   ηSDP: 9.9-7|9.9-7|9.8-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: 1.2-6|1.3-6|1.1-6|1.1-6|-3.6-7|-4.9-7|-5.0-7   time: 8:41|9:23|7:45|6:52|6:40|5:27|4:30
be250.8    251;251;93375   iter: 58942|59801|40847|43010|18511|15377|14505   ηSDP: 9.9-7 ×7   ηgap: 3.4-7|3.4-7|3.1-7|3.3-7|-3.9-7|-3.4-7|-3.5-7   time: 14:26|16:04|11:06|11:34|8:10|6:55|6:31
be250.9    251;251;93375   iter: 32283|31800|25065|23395|8254|6464|7088   ηSDP: 9.9-7 ×7   ηgap: 7.3-7|1.1-6|5.9-7|5.2-7|5.1-8|-3.2-8|-7.8-8   time: 7:08|7:41|6:08|5:43|3:21|2:42|3:00
be250.10   251;251;93375   iter: 42664|45466|32794|32014|15930|11483|11013   ηSDP: 9.9-7|9.9-7|9.6-7|9.7-7|9.9-7|9.9-7|9.9-7   ηgap: 9.1-7|9.0-7|1.0-6|9.6-7|-3.4-7|-4.2-7|-5.2-7   time: 9:36|11:22|8:10|8:05|6:52|5:01|4:48
bqp50-1    51;51;3675      iter: 14720|14287|11974|9908|2498|2540|3099   ηSDP: 9.9-7 ×7   ηgap: -4.0-7|-3.4-7|-3.9-7|-3.9-7|-8.5-7|-7.1-7|-9.2-7   time: 18.7|19.8|17.3|13.7|4.1|4.2|5.3
bqp50-2    51;51;3675      iter: 18994|19368|14001|13567|9276|5145|4195   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.5-7   ηgap: -3.3-7|-3.0-7|-2.8-7|-3.0-7|-3.7-8|-8.8-8|-3.5-7   time: 23|26.2|19.5|18.8|15.4|8.5|7.2
bqp50-3    51;51;3675      iter: 177243|68211|74341|56165|12257|9752|11149   ηSDP: 9.9-7 ×7   ηgap: -3.1-7|-2.3-7|5.8-8|3.8-7|-3.7-7|-3.7-7|-3.5-7   time: 3:42|1:34|1:43|1:18|19.6|16.5|19.2
bqp50-4    51;51;3675      iter: 157203|162674|130657|120852|28763|21304|20760   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.8-7|9.9-7|9.9-7   ηgap: -2.3-7|-2.7-7|-2.9-7|-2.5-7|-3.1-7|-5.1-7|-4.5-7   time: 3:29|3:59|3:11|3:04|52.4|51.5|50
bqp50-5    51;51;3675      iter: 109308|104266|79469|78615|19132|12946|13897   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.8-7   ηgap: -2.2-7|-1.6-7|-1.7-7|-2.5-7|-2.9-7|-2.9-7|-2.8-7   time: 2:56|2:38|1:56|1:53|33.3|23.2|24.9
bqp50-6    51;51;3675      iter: 184039|204810|232612|111458|34091|39132|40007   ηSDP: 9.9-7|9.7-7|9.9-7|9.8-7|9.9-7|9.9-7|9.9-7   ηgap: 9.9-8|6.2-8|-6.0-8|2.0-7|4.4-7|-5.4-8|-2.0-8   time: 3:58|4:58|5:43|2:43|55|1:10|1:13
bqp50-7    51;51;3675      iter: 55655|56399|44561|39690|9918|7648|7503   ηSDP: 9.9-7 ×7   ηgap: -2.5-7|-2.7-7|-2.6-7|-2.7-7|-4.2-7|-3.5-7|-4.3-7   time: 1:12|1:20|1:03|56|18.1|13.8|13.8
bqp50-8    51;51;3675      iter: 61129|77405|60239|59698|11146|8679|12535   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|8.8-7|9.4-7|9.9-7   ηgap: 2.4-8|1.5-8|-3.1-8|-6.6-8|-1.4-7|-2.0-7|-2.1-7   time: 1:21|1:53|1:29|1:27|19.6|15.4|23.2
bqp50-9    51;51;3675      iter: 77506|75610|58591|57791|13151|10742|12249   ηSDP: 9.9-7|9.9-7|9.8-7|9.9-7|9.8-7|9.9-7|9.9-7   ηgap: -1.9-7|-1.4-7|-1.3-7|-2.2-7|-2.8-7|-3.6-7|-1.1-7   time: 1:43|1:50|1:25|1:24|21|18.7|21.7
bqp50-10   51;51;3675      iter: 47014|46988|37824|34880|10039|7604|7565   ηSDP: 9.9-7 ×7   ηgap: -1.9-7|-2.0-7|-1.9-7|-2.0-7|-5.2-7|-5.2-7|-5.4-7   time: 1:01|1:07|53.8|49.3|17|12.3|12.2
bqp100-1   101;101;14850   iter: 16118|18635|12822|11305|5048|4871|4001   ηSDP: 9.9-7 ×7   ηgap: 1.1-6|6.9-7|9.0-7|9.2-7|-5.4-7|-3.1-7|-6.4-7   time: 49.7|58.8|39.9|34.7|22.7|22.6|18.5
bqp100-2   101;101;14850   iter: 60256|58439|53590|47623|13614|11747|11130   ηSDP: 9.9-7 ×7   ηgap: -8.9-8|-1.1-7|-7.3-8|-7.3-8|-1.3-7|-1.2-7|-1.3-7   time: 3:01|2:58|2:47|2:28|1:04|58.5|58.6
bqp100-3   101;101;14850   iter: 233230|231991|182150|169031|49066|37990|34805   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.8-7|8.6-7   ηgap: -9.4-8|-8.8-8|-8.1-8|-6.7-8|-4.1-7|-4.8-7|-4.4-7   time: 12:31|12:42|9:51|9:05|3:59|3:32|3:16
bqp100-4   101;101;14850   iter: 40048|44442|32258|29492|9772|7448|7715   ηSDP: 9.9-7 ×7   ηgap: 4.3-8|-1.4-7|-2.5-8|-1.1-8|-1.1-6|-1.2-6|-1.2-6   time: 2:02|2:18|1:40|1:32|45|34|35.6
bqp100-5   101;101;14850   iter: 53259|52618|44316|39413|7204|5785|6226   ηSDP: 9.9-7 ×7   ηgap: -1.7-8|2.1-8|-2.3-8|-3.1-8|-8.2-7|-1.2-6|-1.1-6   time: 2:45|2:48|2:24|2:09|33.2|26.7|29.1
bqp100-6   101;101;14850   iter: 20692|21092|15795|15390|6514|5540|4436   ηSDP: 9.9-7 ×7   ηgap: -1.4-8|5.0-8|2.7-8|-2.3-8|-1.6-7|-3.6-7|-9.0-7   time: 1:04|1:06|49.3|48.2|29.5|25.6|19.9
bqp100-7   101;101;14850   iter: 49979|48122|39677|37153|9132|6657|7358   ηSDP: 9.9-7 ×7   ηgap: -2.3-7|-2.4-7|-2.4-7|-2.3-7|-1.3-6|-9.0-7|-1.4-6   time: 2:35|2:32|2:05|2:00|41.9|32.9|34.9



Table 4 (continued)

problem | m;nE;nI | iteration | ηSDP | ηgap | CPU time
(each of the last four column groups lists, in order, D-E | P1.0 | P1.6 | P1.9 | iP1.0 | iP1.6 | iP1.9)

('v ×7' abbreviates the value repeated in all seven algorithm columns)

bqp100-8   101;101;14850   iter: 118113|117553|92922|85497|28883|23297|21405   ηSDP: 9.9-7|9.8-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -2.4-8|-1.9-9|-2.7-8|-2.5-8|-5.1-7|-4.8-7|-5.1-7   time: 6:08|6:12|4:50|4:22|2:15|1:54|1:54
bqp100-9   101;101;14850   iter: 80369|78808|64974|59661|15877|15069|13732   ηSDP: 9.9-7 ×7   ηgap: -4.0-8|-3.6-8|-7.5-8|-5.2-8|-5.7-7|-5.0-7|-5.3-7   time: 4:07|4:06|3:20|3:05|1:11|1:10|1:06
bqp100-10  101;101;14850   iter: 125337|125310|100168|91825|25485|19581|18711   ηSDP: 9.9-7 ×7   ηgap: -1.9-7|-1.9-7|-1.8-7|-1.8-7|-5.8-7|-5.8-7|-6.8-7   time: 6:38|6:44|5:21|4:50|1:59|1:47|1:45
bqp250-1   251;251;93375   iter: 75186|62326|57991|54154|15516|10520|10715   ηSDP: 9.9-7 ×7   ηgap: 8.1-8|3.9-7|7.4-8|7.4-8|-1.2-6|-1.9-6|-1.0-6   time: 18:21|16:33|15:36|14:39|7:02|4:56|5:07
bqp250-2   251;251;93375   iter: 53456|45022|42776|38826|13881|11713|10640   ηSDP: 9.9-7 ×7   ηgap: 1.7-7|3.7-7|1.2-7|1.2-7|-4.5-7|-4.3-7|-3.6-7   time: 12:24|11:20|10:51|9:56|5:57|5:14|4:57
bqp250-3   251;251;93375   iter: 51906|56050|39679|39434|17002|13824|12779   ηSDP: 9.9-7 ×7   ηgap: 4.8-7|4.2-7|5.2-7|4.2-7|-4.0-7|-3.8-7|-3.9-7   time: 12:02|14:19|10:09|10:05|7:22|6:12|5:43
bqp250-4   251;251;93375   iter: 40753|38314|33944|26351|10715|8181|7554   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.4-7|9.9-7   ηgap: 2.3-7|5.2-7|1.9-7|4.7-7|-1.2-6|-1.2-6|-1.1-6   time: 9:04|9:20|8:30|6:29|4:42|3:47|3:40
bqp250-5   251;251;93375   iter: 53411|54251|44392|39628|16818|12554|12191   ηSDP: 9.9-7 ×7   ηgap: 1.3-7|1.6-7|7.7-8|9.4-8|-2.8-7|-2.9-7|-2.6-7   time: 13:00|14:33|12:08|10:32|7:44|6:02|5:59
bqp250-6   251;251;93375   iter: 40439|39815|30536|28314|8210|7204|7103   ηSDP: 9.9-7 ×7   ηgap: 1.8-7|2.2-7|1.4-7|1.5-7|-4.4-7|-1.6-7|-1.7-7   time: 9:17|10:03|7:52|7:13|3:27|3:08|3:12
bqp250-7   251;251;93375   iter: 47536|49671|37014|36922|12214|9303|7175   ηSDP: 9.9-7 ×7   ηgap: 3.7-7|3.7-7|3.3-7|2.9-7|-1.0-6|-8.0-7|-7.1-7   time: 11:16|12:41|9:32|9:33|5:22|4:12|3:14
bqp250-8   251;251;93375   iter: 33602|34460|28002|25838|8024|6574|6268   ηSDP: 9.9-7 ×7   ηgap: 4.3-7|4.5-7|2.9-7|3.6-7|-2.5-7|-1.3-7|9.0-7   time: 7:17|8:15|6:48|6:14|3:17|2:50|3:01
bqp250-9   251;251;93375   iter: 54808|55226|41413|37864|16012|11409|10816   ηSDP: 9.9-7 ×7   ηgap: 1.1-7|1.2-7|1.3-7|1.4-7|-6.0-7|-6.6-7|-5.5-7   time: 12:47|14:17|10:40|9:43|6:47|4:59|4:46
bqp250-10  251;251;93375   iter: 33608|29942|28932|26666|9832|7716|7302   ηSDP: 9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7|9.3-7   ηgap: 6.2-7|1.7-6|3.3-7|3.7-7|-4.1-7|-5.6-7|-1.5-7   time: 7:19|7:08|7:10|6:36|4:00|3:13|3:10
bqp500-1   501;501;374250  iter: 500000|500000|500000|47251|20402|16524|13689   ηSDP: 3.1-4|2.2-2|1.4-2|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -1.9-6|-5.3-4|-4.1-4|2.8-7|-2.4-8|-2.7-8|-1.6-7   time: 10:37:20|13:02:03|12:38:30|1:16:43|48:57|41:47|36:16
bqp500-2   501;501;374250  iter: 500000|500000|500000|53886|26250|19700|18089   ηSDP: 5.2-3|1.8-3|9.3-3|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -2.2-4|1.5-4|3.4-4|5.5-7|-1.5-7|-1.7-7|-1.5-7   time: 10:45:52|12:51:42|12:52:06|1:30:14|1:03:16|49:33|47:29
bqp500-3   501;501;374250  iter: 500000|500000|500000|52209|24254|15924|11716   ηSDP: 6.6-4|2.6-3|1.0-2|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -1.8-5|1.1-4|3.5-4|3.1-7|-2.8-7|-3.0-7|-3.2-7   time: 10:54:14|12:49:28|13:03:39|1:30:31|1:02:08|41:13|31:29
bqp500-4   501;501;374250  iter: 500000|500000|65512|59933|18255|14448|12398   ηSDP: 3.3-3|2.5-3|9.9-7|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -1.7-4|-2.6-4|2.3-7|2.1-7|-1.5-6|-1.3-6|-1.0-6   time: 10:50:41|13:08:52|1:52:40|1:44:09|43:22|36:05|31:06
bqp500-5   501;501;374250  iter: 500000|500000|500000|55816|20575|14599|15020   ηSDP: 1.4-3|2.3-3|3.0-3|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -10.0-5|8.0-5|-2.7-5|1.0-7|-2.4-7|2.5-8|-2.8-7   time: 10:55:14|13:25:02|13:08:56|1:32:52|49:40|36:56|38:25
bqp500-6   501;501;374250  iter: 500000|500000|500000|41561|22987|16783|14644   ηSDP: 5.2-4|7.0-4|6.5-3|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -6.8-6|1.5-5|2.4-4|1.3-6|-2.7-7|-2.7-7|-3.9-7   time: 11:20:03|13:03:08|13:08:18|1:05:41|55:43|43:19|38:14
bqp500-7   501;501;374250  iter: 500000|500000|500000|500000|14723|15152|9156   ηSDP: 1.1-3|2.0-3|1.3-2|8.8-3|9.9-7|9.9-7|9.9-7   ηgap: -6.0-5|6.9-5|-3.8-4|-2.6-4|-6.9-7|-7.7-7|-4.3-7   time: 10:59:42|13:14:26|13:16:22|13:15:28|36:33|38:03|24:19
bqp500-8   501;501;374250  iter: 500000|500000|500000|64442|24814|17420|11342   ηSDP: 1.1-3|2.5-3|1.3-2|9.9-7|9.9-7|9.9-7|9.9-7   ηgap: -4.4-5|-8.4-5|-3.4-4|4.1-7|-4.2-7|-4.7-7|-5.1-7   time: 11:29:12|13:33:28|13:55:42|1:49:56|1:03:0

1|4

5:5

8|3

0:0

6

bqp500-9

501;5

01;3

74250

500000|5

00000|5

00000|4

3377|1

9576|1

5101|1

1199

3.1

-4|5

.8-4|1

.0-2|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-5.0

-7|7

.6-5|2

.7-4|5

.0-7|-2.7

-7|-7.7

-8|-9.1

-811:2

0:2

5|13:4

9:3

9|14:0

3:4

4|1:1

5:4

3|5

1:0

9|4

2:0

1|3

2:2

7

bqp500-1

0501;5

01;3

74250

500000|5

00000|6

5291|5

9373|2

4430|1

9293|1

5617

2.6

-3|9

.9-4|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.0

-4|-1.4

-5|2

.2-7|1

.7-7|-3.3

-7|-3.5

-7|-4.2

-711:3

1:5

6|14:1

6:1

5|1:5

5:3

5|1:4

3:5

1|1:0

0:4

7|5

0:1

3|4

2:4

3

gka1a

51;5

1;3

675

34150|3

4737|2

7580|2

6284|4

475|4

398|5

040

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.9

-7|-1.8

-7|-2.1

-7|-2.1

-7|6

.4-8|1

.9-7|-4.2

-747.4|5

0.1|4

0|3

8.1|7

.9|8

.2|9

.3

gka2a

61;6

1;5

310

124262|7

0057|4

3222|4

9568|9

404|7

627|7

871

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

4.5

-7|3

.8-7|4

.1-7|-7.9

-8|-1.5

-7|-1.4

-7|-1.1

-73:0

4|1

:55|1

:11|1

:24|2

3.2|1

9.5|2

0.4

gka3a

71;7

1;7

245

31998|3

1107|2

7090|2

5067|6

936|5

774|5

147

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.7-7|9

.9-7

-1.5

-7|-1.2

-7|-1.9

-7|-1.4

-7|7

.0-8|-4.6

-7|-6.4

-755.4|1

:03|5

4.2|5

0.6|2

0.4|1

6.5|1

5.1

gka8a

101;1

01;1

4850

301490|5

00000|3

61738|3

13698|3

8872|3

1784|3

6337

9.9

-7|1

.8-6|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

9.9

-8|-1.4

-7|-3.3

-8|-4.7

-8|-2.9

-7|-2.6

-7|-2.8

-716:0

5|3

0:1

9|2

2:4

5|1

8:5

3|3

:30|3

:18|3

:40

gka9b

101;1

01;1

4850

7484|7

734|5

039|4

725|1

245|9

47|8

33

9.9

-7|9

.9-7|9

.4-7|9

.8-7|9

.9-7|9

.9-7|9

.9-7

-2.1

-7|-2.0

-7|-2.0

-8|-1.2

-8|-4.7

-7|-4.3

-7|-2.7

-723.6|2

5|1

5.8|1

5.3|8

.1|6|5

.4

gka10b

126;1

26;2

3250

13125|1

3033|1

0232|9

281|1

673|1

340|1

113

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.6

-7|-1.6

-7|-1.6

-7|-1.6

-7|-3.2

-7|-3.2

-7|-4.8

-752.8|5

7.2|4

5.3|4

1.8|1

3.2|1

0.8|8

.9

gka7c

101;1

01;1

4850

270591|2

70353|2

13183|1

99923|6

2652|5

0853|4

5771

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-9.4

-9|-2.6

-9|-1.7

-8|-5.1

-8|-4.5

-7|-4.6

-7|-4.5

-714:5

9|1

5:5

9|1

2:5

2|1

1:4

9|5

:42|4

:42|4

:39

gka1d

101;1

01;1

4850

153347|1

45059|1

17904|1

00814|2

6213|2

1022|1

9709

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.4

-7|-1.1

-7|-1.4

-7|-1.1

-7|-7.0

-7|-7.6

-7|-7.4

-78:2

5|8

:32|6

:43|5

:39|2

:09|1

:47|1

:46

gka2d

101;1

01;1

4850

12761|1

4458|1

1305|9

309|4

388|3

881|3

879

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

8.2

-7|9

.0-7|8

.8-7|1

.1-6|-2.2

-7|-4.5

-8|-1.8

-740.2|4

8.3|3

8.6|3

1.6|2

1.2|2

0|2

1.1

gka3d

101;1

01;1

4850

18783|2

0375|1

6135|1

4463|7

068|4

943|5

486

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

5.0

-7|3

.8-7|4

.2-7|3

.5-7|-3.4

-7|-2.8

-7|-2.9

-759.5|1

:07|5

3.5|4

7.8|3

5.7|2

7.7|3

0.5

gka4d

101;1

01;1

4850

17779|1

8959|1

5273|1

3901|5

723|4

644|4

610

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

2.7

-7|4

.4-7|4

.7-7|4

.8-7|-3.1

-7|-2.6

-7|-3.1

-759|1

:05|5

3|4

7.9|2

8|2

4.1|2

4.4

gka5d

101;1

01;1

4850

24810|2

4278|1

9334|1

7603|5

272|3

978|3

778

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

2.9

-8|4

.6-8|4

.6-8|3

.7-8|-2.4

-7|-1.9

-7|-4.6

-71:2

0|1

:23|1

:06|1

:01|2

6|2

0.5|1

9.1

gka6d

101;1

01;1

4850

31575|3

2317|2

4772|2

2905|8

156|5

738|5

792

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.6

-7|-1.7

-7|-1.5

-7|-1.6

-7|-5.7

-7|-8.9

-7|-1.0

-61:4

1|1

:49|1

:23|1

:17|4

1.1|2

8.4|2

9.1

gka7d

101;1

01;1

4850

14412|1

4697|1

1980|1

1477|5

310|4

247|4

341

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

7.0

-7|9

.8-7|8

.0-7|6

.6-7|-1.6

-6|-9.7

-7|-7.6

-745.8|5

0.1|4

1|3

9.3|2

6.9|2

1.7|2

2.5

gka8d

101;1

01;1

4850

18589|1

7501|1

3844|1

2181|6

633|4

798|4

523

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

4.7

-7|6

.7-7|7

.5-7|8

.4-7|-2.6

-7|-8.0

-7|-1.0

-659.4|5

8.8|4

7.4|4

1.6|3

3.6|2

4|2

2.8

gka9d

101;1

01;1

4850

17204|1

7719|1

2980|1

2321|5

517|4

538|4

410

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

2.4

-7|3

.1-7|2

.8-7|2

.6-7|-7.6

-7|-7.0

-7|-3.6

-755.5|1

:00|4

3.4|4

1.7|2

6.7|2

2.6|2

2.3

gka10d

101;1

01;1

4850

23462|2

1138|1

7548|1

5506|9

770|7

935|7

770

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-2.7

-8|-2.8

-7|-3.0

-7|-2.7

-7|-2.5

-7|-2.5

-7|-2.8

-71:1

5|1

:12|5

8.5|5

1.8|4

9.4|4

0.3|3

9.7

gka1e

201;2

01;5

9700

84099|8

4010|6

8161|6

1687|2

1129|1

7712|1

6953

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

8.9

-8|9

.6-8|1

.7-8|4

.2-8|-5.3

-7|-4.5

-7|-4.4

-713:3

1|1

5:0

2|1

2:2

2|1

1:0

4|6

:19|5

:45|5

:33

gka2e

201;2

01;5

9700

43286|4

3298|3

4608|3

1139|1

0368|8

001|7

451

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.5

-8|-1.3

-8|-4.7

-8|-4.9

-8|-2.1

-7|-2.2

-7|-2.6

-76:3

5|7

:22|5

:54|5

:23|2

:56|2

:16|2

:03

gka3e

201;2

01;5

9700

28104|2

7944|2

0565|1

9928|8

913|7

382|6

417

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

1.2

-6|1

.2-6|1

.2-6|1

.1-6|-2.9

-7|-2.8

-7|-3.3

-74:0

5|4

:35|3

:22|3

:15|2

:30|2

:08|1

:58

gka4e

201;2

01;5

9700

45967|4

6652|3

5948|3

4469|9

708|7

994|7

284

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

1.1

-7|1

.0-7|1

.2-7|1

.1-7|-1.1

-6|-9.3

-7|-8.9

-77:2

4|8

:20|6

:28|6

:10|2

:49|2

:18|2

:07

gka5e

201;2

01;5

9700

41178|3

9322|3

5176|3

2979|9

756|6

259|6

181

9.9

-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7|9

.9-7

2.3

-7|2

.9-7|1

.2-7|1

.1-7|-3.1

-7|-6.4

-7|-3.6

-76:2

6|6

:55|6

:13|5

:50|2

:52|1

:50|1

:52

gka1f

501;5

01;3

74250

500000|5

00000|5

00000|3

6996|2

0529|1

3666|1

2717

1.0

-3|1

.1-3|8

.6-3|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-7.1

-5|1

.3-5|4

.0-4|1

.4-6|-2.4

-7|-2.4

-7|-7.0

-711:2

7:0

3|13:4

1:1

7|14:0

2:2

4|1:0

2:3

3|5

2:0

2|3

6:2

7|3

4:3

0

gka2f

501;5

01;3

74250

500000|5

00000|5

00000|5

1262|1

8588|1

1922|1

1993

3.8

-3|2

.0-3|5

.1-3|9

.9-7|9

.9-7|9

.9-7|9

.6-7

-3.9

-5|2

.4-4|2

.7-4|1

.7-7|-3.8

-7|2

.5-8|5

.8-8

11:5

8:4

2|14:0

3:1

0|14:0

2:0

0|1:3

4:2

4|4

9:3

5|3

2:4

1|3

3:0

0

gka3f

501;5

01;3

74250

500000|5

00000|5

00000|5

1268|1

6420|1

4608|1

2985

2.0

-3|2

.1-3|5

.2-3|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.1

-5|-2.0

-5|8

.3-5|2

.3-7|-1.1

-8|-1.9

-7|-2.5

-712:0

9:5

8|14:4

1:3

9|14:1

4:5

3|1:3

2:5

3|4

3:0

7|3

9:2

7|3

6:4

1

gka4f

501;5

01;3

74250

500000|5

00000|5

00000|5

6339|1

8917|1

4679|1

1215

6.4

-3|1

.7-3|6

.7-3|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-3.5

-4|-1.7

-4|9

.1-5|9

.0-8|-8.4

-7|-9.3

-7|-1.0

-611:4

6:3

8|14:4

1:0

2|14:0

4:5

1|1:4

3:4

6|4

8:1

1|3

8:4

7|3

0:5

9

gka5f

501;5

01;3

74250

500000|5

00000|5

00000|4

7188|2

3314|1

8453|1

6909

1.1

-3|2

.0-3|2

.3-2|9

.9-7|9

.9-7|9

.9-7|9

.9-7

-1.7

-5|5

.0-6|6

.2-4|7

.2-7|-1.6

-7|-1.6

-7|-1.7

-711:4

1:1

4|14:0

7:2

8|14:1

9:3

9|1:2

1:2

1|1:0

0:3

1|4

9:4

8|5

1:0

1
