Stochastic Proximal Algorithms with Applications to Online Image Recovery
Patrick Louis Combettes1 and Jean-Christophe Pesquet2
1 Mathematics Department, North Carolina State University, Raleigh, USA
2 Center for Visual Computing, CentraleSupelec, University Paris-Saclay, Grande Voie des Vignes, 92295 Chatenay-Malabry, France
S3 Seminar - 24 March 2017
Outline
1. Introduction
2. Stochastic Forward-Backward
3. Monotone Inclusion Problems
4. Primal-Dual Extension
5. Application
6. Conclusion
Context
Need for fast optimization methods over the last decade

Why?
• Interest in nonsmooth cost functions (sparsity)
• Need for optimal processing of massive datasets (big data):
  - large number of variables (inverse problems)
  - large number of observations (machine learning)
• Use of more sophisticated data structures (graph signal processing)
Variational formulation
GOAL:

minimize_{x ∈ H}  f(x) + h(x),

where
• H: signal space (real Hilbert space)
• f ∈ Γ0(H): class of convex lower-semicontinuous functions from H to ]−∞,+∞] with a nonempty domain
• h : H → R: differentiable convex function such that ∇h is ϑ^{−1}-Lipschitz continuous with ϑ ∈ ]0,+∞[
• F = Argmin(f + h) assumed to be nonempty.
Algorithm
CLASSICAL SOLUTION [Combettes and Wajs - 2005]
(∀n ∈ N)  x_{n+1} = x_n + λ_n ( prox_{γ_n f}(x_n − γ_n ∇h(x_n)) − x_n ),

FORWARD-BACKWARD ALGORITHM

where λ_n ∈ ]0,1], γ_n ∈ ]0,2ϑ[, and prox_{γ_n f} is the proximity operator of γ_n f [Moreau - 1965]:

prox_{γ_n f} : x ↦ argmin_{y ∈ H}  f(y) + (1/(2γ_n)) ‖x − y‖².
SPECIAL CASES: projected gradient method, iterative soft threshold-ing, Landweber algorithm,...
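As a concrete special case (a sketch, not part of the talk; the problem data are assumptions): with f = μ‖·‖₁ and h(x) = ½‖Ax − b‖², the proximity operator is componentwise soft-thresholding and the forward-backward iteration is exactly iterative soft-thresholding.

```python
import numpy as np

def soft_threshold(x, t):
    # prox of t*||.||_1: componentwise soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, mu, n_iter=500, lam=1.0):
    # min_x  mu*||x||_1 + 0.5*||A x - b||^2
    # grad h(x) = A^T(Ax - b) is Lipschitz with constant ||A||^2 = 1/theta,
    # so any step gamma in ]0, 2/||A||^2[ is admissible; lam plays lambda_n in ]0,1]
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # forward (gradient) step
        p = soft_threshold(x - gamma * grad, gamma * mu)  # backward (prox) step
        x = x + lam * (p - x)                             # relaxed update
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true                                            # noiseless observations
x_hat = forward_backward(A, b, mu=0.1)
```

With noiseless data and a small μ, the iterate recovers the sparse x_true up to a small regularization bias.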
In the context of online processing and machine learning, what can be done if ∇h and f are not known exactly?
Proposed Solution
(∀n ∈ N)  x_{n+1} = x_n + λ_n ( prox_{γ_n f_n}(x_n − γ_n u_n) + a_n − x_n ),

STOCHASTIC FB ALGORITHM

where
• λ_n ∈ ]0,1] and γ_n ∈ ]0,2ϑ[
• f_n ∈ Γ0(H): approximation to f
• u_n second-order random variable: approximation to ∇h(x_n)
• a_n second-order random variable: possible additional error term.
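A minimal sketch of this stochastic iteration (assumptions, not from the slides: f_n ≡ f, a_n ≡ 0, λ_n ≡ 1, and u_n a gradient estimate averaged over a growing batch so that its error shrinks along the run):

```python
import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0, 0.0, 0.0])

def grad_oracle(x, n):
    # u_n: estimate of grad h(x) for h(x) = 0.5*E|a^T(x - x_true)|^2, a ~ N(0, Id),
    # averaged over a growing batch (so the variance decays with n)
    m = (n + 1) ** 2
    A = rng.standard_normal((m, 4))
    y = A @ x_true
    return A.T @ (A @ x - y) / m

def stochastic_fb(grad_oracle, prox, x0, gamma, n_iter):
    # x_{n+1} = x_n + lam_n*(prox_{gamma f}(x_n - gamma u_n) + a_n - x_n),
    # here with f_n = f, a_n = 0, lam_n = 1
    x = x0.copy()
    for n in range(n_iter):
        u = grad_oracle(x, n)
        x = x + 1.0 * (prox(x - gamma * u, gamma) - x)
    return x

prox_id = lambda v, g: v      # f = 0 in this toy run, so the prox is the identity
x_hat = stochastic_fb(grad_oracle, prox_id, np.zeros(4), gamma=0.3, n_iter=60)
```

Despite the noisy gradients in the early iterations, the iterate settles near the minimizer x_true as the batches grow.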
Assumptions
Let X = (X_n)_{n∈N} be a sequence of sigma-algebras such that

(∀n ∈ N)  σ(x_0, ..., x_n) ⊂ X_n ⊂ X_{n+1},

where σ(x_0, ..., x_n) is the smallest σ-algebra generated by x_0, ..., x_n.

ℓ_+(X): set of sequences of [0,+∞[-valued random variables (ξ_n)_{n∈N} such that, for every n ∈ N, ξ_n is X_n-measurable, and

ℓ¹_+(X) = { (ξ_n)_{n∈N} ∈ ℓ_+(X) | Σ_{n∈N} ξ_n < +∞ P-a.s. }
ℓ^∞_+(X) = { (ξ_n)_{n∈N} ∈ ℓ_+(X) | sup_{n∈N} ξ_n < +∞ P-a.s. }.
Assumptions on the gradient approximation:
• Σ_{n∈N} √λ_n ‖E(u_n | X_n) − ∇h(x_n)‖ < +∞.
• For every z ∈ F, there exist sequences (τ_n)_{n∈N} ∈ ℓ_+ and (ζ_n(z))_{n∈N} ∈ ℓ^∞_+(X) such that Σ_{n∈N} √λ_n ζ_n(z) < +∞ and

(∀n ∈ N)  E(‖u_n − E(u_n | X_n)‖² | X_n) ≤ τ_n ‖∇h(x_n) − ∇h(z)‖² + ζ_n(z).
Assumptions on the prox approximation:
• There exist sequences (α_n)_{n∈N} and (β_n)_{n∈N} in [0,+∞[ such that Σ_{n∈N} √λ_n α_n < +∞, Σ_{n∈N} λ_n β_n < +∞, and

(∀n ∈ N)(∀x ∈ H)  ‖prox_{γ_n f_n} x − prox_{γ_n f} x‖ ≤ α_n ‖x‖ + β_n.

• Σ_{n∈N} λ_n √(E(‖a_n‖² | X_n)) < +∞.
Assumptions on the algorithm parameters:
• inf_{n∈N} γ_n > 0, sup_{n∈N} τ_n < +∞, and sup_{n∈N} (1 + τ_n) γ_n < 2ϑ.
• Either inf_{n∈N} λ_n > 0 or [γ_n ≡ γ, Σ_{n∈N} τ_n < +∞, and Σ_{n∈N} λ_n = +∞].
Convergence Result
Under the previous assumptions, the sequence (x_n)_{n∈N} generated by the algorithm converges weakly a.s. to an F-valued random variable.
REMARKS:
⋆ Related works: [Rosasco et al. - 2014, Atchade et al. - 2016]
⋆ Result valid for non-vanishing step sizes (γ_n)_{n∈N}.
⋆ We do not need to assume that (∀n ∈ N) E(u_n | X_n) = ∇h(x_n).
⋆ Proof based on properties of stochastic quasi-Fejér sequences [Combettes and Pesquet – 2015, 2016].
Stochastic Quasi-Fejér Sequences

• Let φ : [0,+∞[ → [0,+∞[ with φ(t) ↑ +∞ as t → +∞.
• Deterministic definition: A sequence (x_n)_{n∈N} in H is Fejér monotone with respect to F if, for every z ∈ F,

(∀n ∈ N)  φ(‖x_{n+1} − z‖) ≤ φ(‖x_n − z‖).

• Stochastic definition 1: A sequence (x_n)_{n∈N} of H-valued random variables is stochastically Fejér monotone with respect to F if, for every z ∈ F,

(∀n ∈ N)  E(φ(‖x_{n+1} − z‖) | X_n) ≤ φ(‖x_n − z‖).

• Stochastic definition 2: A sequence (x_n)_{n∈N} of H-valued random variables is stochastically quasi-Fejér monotone with respect to F if, for every z ∈ F, there exist (χ_n(z))_{n∈N} ∈ ℓ¹_+(X), (ϑ_n(z))_{n∈N} ∈ ℓ_+(X), and (η_n(z))_{n∈N} ∈ ℓ¹_+(X) such that

(∀n ∈ N)  E(φ(‖x_{n+1} − z‖) | X_n) + ϑ_n(z) ≤ (1 + χ_n(z)) φ(‖x_n − z‖) + η_n(z).

Suppose (x_n)_{n∈N} is stochastically quasi-Fejér monotone w.r.t. F. Then
• (∀z ∈ F)  Σ_{n∈N} ϑ_n(z) < +∞ P-a.s.
• [W(x_n)_{n∈N} ⊂ F P-a.s.] ⇔ [(x_n)_{n∈N} converges weakly P-a.s. to an F-valued random variable].

W(x_n)_{n∈N}: set of weak sequential cluster points of (x_n)_{n∈N}.
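A tiny Monte Carlo sanity check (an illustrative sketch, not from the talk; the quadratic h, step γ, and noise level σ are assumptions): for SGD on h(x) = ½‖x − z‖² with additive Gaussian gradient noise, a direct computation gives E(‖x_{n+1} − z‖² | X_n) = (1 − γ)² ‖x_n − z‖² + γ²σ²d, so with square-summable steps γ_n the extra term is summable and (x_n) is stochastically quasi-Fejér monotone with φ(t) = t². The snippet checks the one-step identity by simulation.

```python
import numpy as np

rng = np.random.default_rng(2)
z = np.array([1.0, -1.0])     # minimizer of h(x) = 0.5*||x - z||^2
sigma, gamma = 0.5, 0.1

def sgd_step(x):
    # one SGD step with additive gradient noise of standard deviation sigma
    u = (x - z) + sigma * rng.standard_normal(2)
    return x - gamma * u

# One-step identity behind the quasi-Fejer inequality (phi(t) = t^2):
# E(||x_{n+1} - z||^2 | X_n) = (1-gamma)^2*||x_n - z||^2 + gamma^2*sigma^2*d
x_n = np.array([3.0, 2.0])
samples = [np.sum((sgd_step(x_n) - z) ** 2) for _ in range(100_000)]
lhs = np.mean(samples)
rhs = (1 - gamma) ** 2 * np.sum((x_n - z) ** 2) + gamma ** 2 * sigma ** 2 * 2
```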
More General Problem

GOAL: Find x ∈ H such that 0 ∈ Ax + Bx,

where
• A : H → 2^H: maximally monotone operator, i.e.

(x, u) ∈ gra A ⇔ (∀(y, v) ∈ gra A) ⟨x − y | u − v⟩ ≥ 0.

If A is maximally monotone, then its resolvent J_A = (Id + A)^{−1} is a firmly nonexpansive operator from H to H.
• B : H → H: ϑ-cocoercive operator, with ϑ ∈ ]0,+∞[, i.e.

(∀x ∈ H)(∀y ∈ H)  ⟨x − y | Bx − By⟩ ≥ ϑ ‖Bx − By‖²,

• F = zer(A + B) assumed to be nonempty.

EXAMPLE: A = ∂f with f ∈ Γ0(H) and B = ∇h with h convex with a ϑ^{−1}-Lipschitzian gradient.
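To make the resolvent concrete (a sketch, not from the slides): for H = R and A = ∂f with f = |·|, the resolvent J_{γA} = (Id + γA)^{−1} is soft-thresholding, and its firm nonexpansiveness can be checked numerically.

```python
import numpy as np

def J(x, gamma):
    # Resolvent (Id + gamma*A)^{-1} for A = subdifferential of |.| on R:
    # this is exactly soft-thresholding at level gamma
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

# Firm nonexpansiveness: <Jx - Jy, x - y> >= |Jx - Jy|^2 for all x, y
rng = np.random.default_rng(3)
gamma, ok = 0.7, True
for _ in range(1000):
    x, y = rng.uniform(-5.0, 5.0, 2)
    dj, d = J(x, gamma) - J(y, gamma), x - y
    ok = ok and (dj * d >= dj ** 2 - 1e-12)
```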
Proposed Solution
(∀n ∈ N)  x_{n+1} = x_n + λ_n ( J_{γ_n A_n}(x_n − γ_n u_n) + a_n − x_n ),

STOCHASTIC FB ALGORITHM

where
• λ_n ∈ ]0,1] and γ_n ∈ ]0,2ϑ[
• J_{γ_n A_n}: resolvent of a maximally monotone operator γ_n A_n : H → 2^H approximating γ_n A
• u_n second-order random variable: approximation to Bx_n
• a_n second-order random variable: possible additional error term
Convergence Conditions

Assumptions on the approximation to the cocoercive operator:
• Σ_{n∈N} √λ_n ‖E(u_n | X_n) − Bx_n‖ < +∞.
• For every z ∈ F, there exist sequences (τ_n)_{n∈N} ∈ ℓ_+ and (ζ_n(z))_{n∈N} ∈ ℓ^∞_+(X) such that Σ_{n∈N} √λ_n ζ_n(z) < +∞ and

(∀n ∈ N)  E(‖u_n − E(u_n | X_n)‖² | X_n) ≤ τ_n ‖Bx_n − Bz‖² + ζ_n(z).
Assumptions on the resolvent approximation:
• There exist sequences (α_n)_{n∈N} and (β_n)_{n∈N} in [0,+∞[ such that Σ_{n∈N} √λ_n α_n < +∞, Σ_{n∈N} λ_n β_n < +∞, and

(∀n ∈ N)(∀x ∈ H)  ‖J_{γ_n A_n} x − J_{γ_n A} x‖ ≤ α_n ‖x‖ + β_n.

• Σ_{n∈N} λ_n √(E(‖a_n‖² | X_n)) < +∞.
Assumptions on the algorithm parameters:
• inf_{n∈N} γ_n > 0, sup_{n∈N} τ_n < +∞, and sup_{n∈N} (1 + τ_n) γ_n < 2ϑ.
• Either inf_{n∈N} λ_n > 0 or [γ_n ≡ γ, Σ_{n∈N} τ_n < +∞, and Σ_{n∈N} λ_n = +∞].
Convergence Result
Under the previous assumptions, the sequence (x_n)_{n∈N} generated by the algorithm converges weakly a.s. to an F-valued random variable.

In addition, if A or B is demiregular at every z ∈ F, then the sequence (x_n)_{n∈N} generated by the algorithm converges strongly a.s. to an F-valued random variable.

A is demiregular at x ∈ dom A if, for every sequence (x_n, u_n)_{n∈N} in gra A and every u ∈ Ax such that x_n ⇀ x and u_n → u, we have x_n → x.
Example: A strongly monotone, i.e. there exists α ∈ ]0,+∞[ such that A − α Id is monotone.
Primal-Dual Splitting
GOAL:

minimize_{x ∈ H}  f(x) + Σ_{k=1}^q g_k(L_k x) + h(x)

where
• H: real Hilbert space
• f ∈ Γ0(H)
• h : H → R: differentiable convex function with ϑ^{−1}-Lipschitz continuous gradient
• g_k ∈ Γ0(G_k) with G_k real Hilbert space
• L_k: bounded linear operator from H to G_k
• ∃ x ∈ H such that 0 ∈ ∂f(x) + Σ_{k=1}^q L_k* ∂g_k(L_k x) + ∇h(x).
Reformulation
Let
• K = H ⊕ G with G = G_1 ⊕ ··· ⊕ G_q
• g : G → ]−∞,+∞] : v ↦ Σ_{k=1}^q g_k(v_k)
• L : H → G : x ↦ (L_k x)_{1≤k≤q}
• A : K → 2^K : (x, v) ↦ (∂f(x) + L*v) × (−Lx + ∂g*(v))
• B : K → K : (x, v) ↦ (∇h(x), 0)
• V : K → K : (x, v) ↦ (ρ^{−1}x − L*v, −Lx + U^{−1}v) with U = Diag(σ_1 Id, ..., σ_q Id), where (ρ, σ_1, ..., σ_q) ∈ ]0,+∞[^{q+1} and ρ Σ_{k=1}^q σ_k ‖L_k‖² < 1.

In the renormed space (K, ‖·‖_V), V^{−1}A is maximally monotone and V^{−1}B is cocoercive. In addition, finding a zero of the sum of these operators is equivalent to finding a pair of primal-dual solutions.
Resulting Algorithm

for n = 0, 1, ...
⌊ y_n = prox_{ρ f_n}( x_n − ρ ( Σ_{k=1}^q L_k* v_{k,n} + u_n ) ) + b_n
  x_{n+1} = x_n + λ_n (y_n − x_n)
  for k = 1, ..., q
  ⌊ w_{k,n} = prox_{σ_k g_k*}( v_{k,n} + σ_k L_k (2y_n − x_n) ) + c_{k,n}
    v_{k,n+1} = v_{k,n} + λ_n (w_{k,n} − v_{k,n}).

STOCHASTIC PRIMAL-DUAL ALGORITHM

where
• λ_n ∈ ]0,1] with Σ_{n∈N} λ_n = +∞ and (ρ^{−1} − Σ_{k=1}^q σ_k ‖L_k‖²) ϑ > 1/2
• f_n ∈ Γ0(H): approximation to f
• u_n second-order random variable: approximation to ∇h(x_n)
• b_n and c_n second-order random variables: possible additional error terms

REMARKS:
⋆ Extension of the deterministic algorithms in [Esser et al. – 2010], [Chambolle and Pock – 2011], [Vu – 2013], [Condat – 2013]
⋆ Parallel structure
⋆ No inversion of operators related to (L_k)_{1≤k≤q} required.
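A minimal deterministic instance of the algorithm above (a sketch with assumed parameters, not the talk's experiment: q = 1, f = 0, exact gradients u_n = ∇h(x_n), no error terms): 1-D total-variation denoising, where prox_{σ_1 g_1*} is the projection onto the box [−μ, μ]^{N−1}.

```python
import numpy as np

def primal_dual_tv(y, mu, rho, sigma, n_iter=500, lam=1.0):
    # Deterministic special case of the stochastic primal-dual algorithm with
    # q = 1, f = 0, h(x) = 0.5*||x - y||^2 (so u_n = grad h(x_n) exactly),
    # g_1 = mu*||.||_1, L_1 = D (finite differences), b_n = c_{1,n} = 0.
    n = len(y)
    D = np.diff(np.eye(n), axis=0)              # L_1 as a matrix, ||D||^2 <= 4
    x, v = np.zeros(n), np.zeros(n - 1)
    for _ in range(n_iter):
        p = x - rho * (D.T @ v + (x - y))       # prox_{rho*f} = Id since f = 0
        x_new = x + lam * (p - x)
        # prox_{sigma*g_1^*}: projection onto the ball {|v_k| <= mu}
        w = np.clip(v + sigma * (D @ (2 * p - x)), -mu, mu)
        v = v + lam * (w - v)
        x = x_new
    return x

# Step sizes satisfy rho*sigma*||D||^2 = 0.4 < 1 and (1/rho - sigma*||D||^2)*theta > 1/2
y = np.concatenate([np.full(20, 1.0), np.full(20, 3.0)])  # piecewise-constant signal
x_hat = primal_dual_tv(y, mu=0.05, rho=0.2, sigma=0.5)
```

With a small μ and noiseless data, the TV solution stays close to the piecewise-constant input, with only a slight shrinkage at the jump.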
Assumptions
Let X = (X_n)_{n∈N} be a sequence of sigma-algebras such that

(∀n ∈ N)  σ((x_{n'}, v_{n'})_{0≤n'≤n}) ⊂ X_n ⊂ X_{n+1}.

Assumptions on the gradient approximation:
• Σ_{n∈N} √λ_n ‖E(u_n | X_n) − ∇h(x_n)‖ < +∞.
• For every z ∈ F, there exists (ζ_n(z))_{n∈N} ∈ ℓ^∞_+(X) such that Σ_{n∈N} √λ_n ζ_n(z) < +∞ and

(∀n ∈ N)  E(‖u_n − E(u_n | X_n)‖² | X_n) ≤ τ_n ‖∇h(x_n) − ∇h(z)‖² + ζ_n(z).
Assumptions on the prox approximations:
• There exist sequences (α_n)_{n∈N} and (β_n)_{n∈N} in [0,+∞[ such that Σ_{n∈N} √λ_n α_n < +∞, Σ_{n∈N} λ_n β_n < +∞, and

(∀n ∈ N)(∀x ∈ H)  ‖prox_{γ_n f_n} x − prox_{γ_n f} x‖ ≤ α_n ‖x‖ + β_n.

• Σ_{n∈N} λ_n √(E(‖b_n‖² | X_n)) < +∞ and Σ_{n∈N} λ_n √(E(‖c_n‖² | X_n)) < +∞.
Convergence Result
• F: set of solutions to the primal problem
• F*: set of solutions to the dual problem

Under the previous assumptions, the sequence (x_n)_{n∈N} converges weakly a.s. to an F-valued random variable, and the sequence (v_n)_{n∈N} converges weakly a.s. to an F*-valued random variable.
Online Image Recovery
OBSERVATION MODEL

(∀n ∈ N)  z_n = K_n x + e_n,

where
• x ∈ H = R^N: unknown image
• K_n: R^{M×N}-valued random matrix
• e_n: R^M-valued random noise vector.

OBJECTIVE: recover x from (K_n, z_n)_{n∈N}.
Application of Primal-Dual Algorithm

FORMULATION
• Mean square error criterion

(∀x ∈ R^N)  h(x) = (1/2) E‖K_0 x − z_0‖²,

assuming that (K_n, z_n)_{n∈N} are identically distributed
• Statistics of (K_n, z_n)_{n∈N} learnt online ⇒ approximation to ∇h(x_n):

u_n = (1/m_{n+1}) Σ_{n'=0}^{m_{n+1}−1} K_{n'}^T (K_{n'} x_n − z_{n'}),

where (m_n)_{n∈N} is a strictly increasing sequence in N
⇒ recursive computation: u_n = R_n x_n − c_n with

R_n = (1/m_{n+1}) Σ_{n'=0}^{m_{n+1}−1} K_{n'}^T K_{n'} = (m_n/m_{n+1}) R_{n−1} + (1/m_{n+1}) Σ_{n'=m_n}^{m_{n+1}−1} K_{n'}^T K_{n'}.

• f and g_1 ∘ L_1 (q = 1): regularization terms
CONDITIONS FOR CONVERGENCE
• (K_n, e_n)_{n∈N} is an i.i.d. sequence such that E‖K_0‖⁴ < +∞ and E‖e_0‖⁴ < +∞.
• Approximation to ∇h(x_n):

u_n = (1/m_{n+1}) Σ_{n'=0}^{m_{n+1}−1} K_{n'}^T (K_{n'} x_n − z_{n'}),

where (m_n)_{n∈N} is a strictly increasing sequence in N such that m_n = O(n^{1+δ}) with δ ∈ ]0,+∞[.
• λ_n = O(n^{−κ}), where κ ∈ ]1 − δ, 1] ∩ [0, 1].
• f_n ≡ f and the domain of f is bounded.
• b_n ≡ 0 and c_n ≡ 0.
Simulation example
• Grayscale image of size 256 × 256 with pixel values in [0, 255]
• Stochastic blur (uniform i.i.d. subsampling of a uniform 5 × 5 blur performed in the discrete Fourier domain with 70% of the frequency bins set to zero).
• Additive white N(0, 5²) noise.
• f = ι_{[0,255]^N} and g_1 ∘ L_1 = isotropic total variation.
• Parameter choice:

(∀n ∈ N)  m_n = n^{1.1},  λ_n = (1 + (n/500)^{0.95})^{−1}.
[Figures] Original image x · Restored image (SNR = 28.1 dB) · Degraded image 1 (SNR = 0.14 dB) · Degraded image 2 (SNR = 12.0 dB)
[Plot] ‖x_n − x_∞‖ versus the iteration number n.
Conclusion
• Investigation of stochastic variants of forward-backward and primal-dual proximal algorithms.
• Stochastic approximations to both smooth and nonsmooth convex functions.
• Extension to monotone inclusion problems.
• Theoretical guarantees of convergence.
• Novel application to online image recovery.
Some references

P. L. Combettes and V. R. Wajs, "Signal recovery by proximal forward-backward splitting," Multiscale Model. Simul., vol. 4, pp. 1168–1200, 2005.

P. L. Combettes and J.-C. Pesquet, "Proximal splitting methods in signal processing," in Fixed-Point Algorithms for Inverse Problems in Science and Engineering, H. H. Bauschke, R. Burachik, P. L. Combettes, V. Elser, D. R. Luke, and H. Wolkowicz, editors, Springer-Verlag, New York, pp. 185–212, 2011.

P. L. Combettes and J.-C. Pesquet, "Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping," SIAM Journal on Optimization, vol. 25, no. 2, pp. 1221–1248, July 2015.

N. Komodakis and J.-C. Pesquet, "Playing with duality: An overview of recent primal-dual approaches for solving large-scale optimization problems," IEEE Signal Processing Magazine, vol. 32, no. 6, pp. 31–54, Nov. 2015.

P. L. Combettes and J.-C. Pesquet, "Stochastic approximations and perturbations in forward-backward splitting for monotone operators," Pure and Applied Functional Analysis, vol. 1, no. 1, pp. 13–37, Jan. 2016.