QUASI MAXIMUM LIKELIHOOD BLIND DECONVOLUTION

Alexander Bronstein
BIBLIOGRAPHY
A. Bronstein, M. Bronstein, and M. Zibulevsky, "Quasi maximum likelihood blind deconvolution: super- and sub-Gaussianity vs. asymptotic stability", submitted to IEEE Trans. Sig. Proc.
A. Bronstein, M. Bronstein, M. Zibulevsky, and Y. Y. Zeevi, "Quasi maximum likelihood blind deconvolution: asymptotic performance analysis", submitted to IEEE Trans. Information Theory.
A. Bronstein, M. Bronstein, and M. Zibulevsky, "Relative optimization for blind deconvolution", submitted to IEEE Trans. Sig. Proc.
A. Bronstein, M. Bronstein, M. Zibulevsky, and Y. Y. Zeevi, "Quasi maximum likelihood blind deconvolution of images acquired through scattering media", submitted to ISBI 2004.
A. Bronstein, M. Bronstein, M. Zibulevsky, and Y. Y. Zeevi, "Quasi maximum likelihood blind deconvolution of images using optimal sparse representations", CCIT Report No. 455 (EE No. 1399), Dept. of Electrical Engineering, Technion, Israel, December 2003.
A. Bronstein, M. Bronstein, and M. Zibulevsky, "Blind deconvolution with relative Newton method", CCIT Report No. 444 (EE No. 1385), Dept. of Electrical Engineering, Technion, Israel, October 2003.
AGENDA

- Introduction
  - Problem formulation
  - Applications
- QML blind deconvolution
- Asymptotic analysis
- Relative Newton
- Generalizations
BLIND DECONVOLUTION PROBLEM
CONVOLUTION MODEL:   s --[ W ]--> x        DECONVOLUTION:   x --[ H ]--> y

Convolution model:  $x_n = \sum_k w_k\, s_{n-k} + u_n$,
where $s_n$ is the source signal, $w_n$ the convolution kernel, $x_n$ the observed signal, and $u_n$ the sensor noise signal.

Deconvolution:  $y_n = \sum_k h_k\, x_{n-k} \approx c\, s_{n-\Delta}$,
where $h_n$ is the restoration kernel, $y_n$ the source estimate, $c$ an arbitrary scaling factor, and $\Delta$ an arbitrary delay.
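The model above can be sketched numerically. A minimal example, assuming a short minimum-phase kernel $w$ (so that a stable FIR approximation of its inverse exists) and a Laplacian source; the truncation length 32 is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 10_000
s = rng.laplace(size=T)            # super-Gaussian source signal
w = np.array([1.0, 0.5, 0.25])     # convolution kernel (minimum-phase here)

x = np.convolve(s, w)[:T]          # observed signal  x_n = sum_k w_k s_{n-k}

# A restoration kernel h should satisfy (w * h)_n ~ c * delta_{n - Delta}.
# For this minimum-phase w, truncating the impulse response of 1/W(z) works:
h = np.zeros(32)
h[0] = 1.0 / w[0]
for n in range(1, len(h)):         # recursion from W(z) H(z) = 1
    h[n] = -(w[1] * h[n - 1] + (w[2] * h[n - 2] if n >= 2 else 0.0)) / w[0]

y = np.convolve(x, h)[:T]          # restored signal  y_n = sum_k h_k x_{n-k}
print(np.max(np.abs(y - s)))       # tiny: restoration up to truncation error
```

Here $c = 1$ and $\Delta = 0$; for a non-minimum-phase $w$, the stable inverse is two-sided and a nonzero delay appears.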
APPLICATIONS
- Acoustics, speech processing: DEREVERBERATION
- Optics, image processing, biomedical imaging: DEBLURRING
- Communications: CHANNEL EQUALIZATION
- Control: SYSTEM IDENTIFICATION
- Statistics, finance: ARMA ESTIMATION
AGENDA

- Introduction
- QML blind deconvolution
  - ML vs. QML
  - The choice of φ(s)
  - Equivariance
  - Gradient and Hessian
- Asymptotic analysis
- Relative Newton
- Generalizations
ML BLIND DECONVOLUTION

MAXIMUM-LIKELIHOOD BLIND DECONVOLUTION:

$\ell(x; h) = -\frac{T}{2\pi}\int_0^{2\pi} \log\left|H(e^{i\theta})\right| d\theta \;-\; \sum_n \log f(y_n) \;\longrightarrow\; \min_h$

ASSUMPTIONS:
1. $s_n$ is i.i.d. with probability density function $f(s)$.
2. $h_n$ has no zeros on the unit circle, i.e., $H(e^{i\theta}) \neq 0$.
3. No noise (precisely: no noise model).
4. $s_n$ is zero-mean.
QUASI ML BLIND DECONVOLUTION

PROBLEMS OF MAXIMUM LIKELIHOOD:
- The true source PDF $f(s)$ is usually unknown.
- Many times $f(s)$ is non-log-concave, making $-\log f(s)$ ill-suited for optimization.

QUASI MAXIMUM LIKELIHOOD BLIND DECONVOLUTION: substitute $-\log f(s)$ with some model function $\varphi(s)$:

$\ell(x; h) = -\frac{T}{2\pi}\int_0^{2\pi} \log\left|H(e^{i\theta})\right| d\theta \;+\; \sum_n \varphi(y_n)$
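The QML objective can be evaluated directly, approximating the frequency-domain integral by a uniform sum over FFT bins. A minimal sketch; the smoothed-absolute-value form of φ and the helper name `qml_objective` are illustrative assumptions:

```python
import numpy as np

def qml_objective(x, h, phi, nfft=4096):
    """l(x; h) = -(T/2pi) * int log|H(e^{i theta})| d theta + sum_n phi(y_n),
    with the frequency integral approximated by a mean over FFT bins."""
    T = len(x)
    H = np.fft.fft(h, nfft)                       # H(e^{i theta}) on a uniform grid
    log_term = -T * np.mean(np.log(np.abs(H)))    # Riemann sum of the integral
    y = np.convolve(x, h)[:T]                     # y = h * x
    return log_term + np.sum(phi(y))

# A super-Gaussian model function: smoothed absolute value (assumed form)
lam = 0.01
phi = lambda s: np.abs(s) - lam * np.log1p(np.abs(s) / lam)

rng = np.random.default_rng(1)
x = rng.laplace(size=1000)
print(qml_objective(x, np.array([1.0]), phi))     # identity kernel: log term is 0
```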
THE CHOICE OF φ(s)

SUPER-GAUSSIAN sources: $\varphi(s) \approx |s|$, in practice a smoothed absolute value, e.g. $\varphi_\lambda(s) = |s| - \lambda \log(1 + |s|/\lambda)$.
SUB-GAUSSIAN sources: a model function with sub-Gaussian (power-type) growth.

[Plot: the super-Gaussian and sub-Gaussian model functions φ(s), drawn for several parameter values (0, 0.01, 0.1 and 4, 10).]
EQUIVARIANCE

QML estimator of $h$ given the observation $x$:
$\hat h(x) = \arg\min_h \ell(x; h)$

Theorem: The QML estimator is equivariant, i.e., for every invertible kernel $a$ it holds that
$\hat h(a * x) = a^{-1} * \hat h(x),$
where $a^{-1}$ stands for the impulse response of the inverse of $a$.
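Equivariance rests on the identity $\ell(a * x;\, h) = \ell(x;\, a * h) + \frac{T}{2\pi}\int\log|A(e^{i\theta})|\,d\theta$, which can be checked numerically on an FFT grid. A sketch; the kernels and the choice $\varphi(s) = \log\cosh s$ are arbitrary illustrative assumptions:

```python
import numpy as np

def qml_objective(x, h, phi, nfft=2048):
    # QML objective with the frequency integral approximated on an FFT grid
    T = len(x)
    return (-T * np.mean(np.log(np.abs(np.fft.fft(h, nfft))))
            + np.sum(phi(np.convolve(x, h)[:T])))

rng = np.random.default_rng(0)
phi = lambda s: np.log(np.cosh(s))
T = 1000
x = rng.laplace(size=T)
a = np.array([1.0, -0.4, 0.1])      # invertible kernel (no zeros on unit circle)
h = np.array([0.8, 0.2])

ax = np.convolve(a, x)[:T]          # observation filtered by a
lhs = qml_objective(ax, h, phi)
rhs = (qml_objective(x, np.convolve(a, h), phi)
       + T * np.mean(np.log(np.abs(np.fft.fft(a, 2048)))))
print(lhs - rhs)                    # ~ 0: both sides agree up to rounding
```

Since the two objectives differ only by a constant independent of $h$, their minimizers are related exactly as the theorem states.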
GRADIENT & HESSIAN OF $\ell(x; h)$

GRADIENT:
$\frac{\partial \ell(x;h)}{\partial h_k} = -T\,\big(J h^{-1}\big)_k + \sum_n \varphi'(y_n)\, x_{n-k},$
where $J$ stands for the mirror operator, $(Ja)_k = a_{-k}$, and $h^{-1}$ for the impulse response of the inverse of $h$.

HESSIAN:
$\frac{\partial^2 \ell(x;h)}{\partial h_k\, \partial h_l} = T\,\big(J(h^{-1} * h^{-1})\big)_{k+l} + \sum_n \varphi''(y_n)\, x_{n-k}\, x_{n-l}.$
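The gradient expression can be validated against finite differences of the objective. A sketch assuming an FFT-grid approximation of the inverse kernel $h^{-1}$; the helper name `qml_gradient` is illustrative:

```python
import numpy as np

def qml_gradient(x, h, dphi, nfft=2048):
    """d l / d h_k = -T * (J h^{-1})_k + sum_n phi'(y_n) x_{n-k},
    with J the mirror operator, (Ja)_k = a_{-k}, and h^{-1} the impulse
    response of the inverse kernel, computed here on an FFT grid."""
    T, K = len(x), len(h)
    h_inv = np.fft.ifft(1.0 / np.fft.fft(h, nfft))    # impulse response of 1/H
    mirror = np.real(h_inv[(-np.arange(K)) % nfft])   # (J h^{-1})_k = (h^{-1})_{-k}
    y = np.convolve(x, h)[:T]                         # y = h * x
    data = np.array([np.dot(dphi(y[k:]), x[:T - k]) for k in range(K)])
    return -T * mirror + data

rng = np.random.default_rng(0)
x = rng.laplace(size=500)
print(qml_gradient(x, np.array([1.0, 0.3, -0.2]), np.tanh))
```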
AGENDA

- Introduction
- QML blind deconvolution
- Asymptotic analysis
  - Asymptotic Hessian structure
  - Asymptotic error covariance
  - Cramér-Rao bounds
  - Superefficiency
  - Examples
- Relative Newton
- Generalizations
ASYMPTOTIC HESSIAN AT THE SOLUTION POINT

At the solution point, $h = c\,w^{-1}$, so that $y = h * x = c\,s$.

For a sufficiently large sample size $T$, the Hessian of $\ell(cs;\,\cdot\,)$ becomes approximately block-diagonal, coupling only the pairs $(h_k, h_{-k})$, with entries determined by the moments
$\beta = E\,\varphi''(cs), \qquad \gamma = E\,\varphi''(cs)\,(cs)^2, \qquad \sigma^2 = E\,(cs)^2.$
ASYMPTOTIC ERROR COVARIANCE

Estimated kernel from the data:  $\hat h(x) = \arg\min_h \ell(x; h)$, so that $\nabla\ell(x; \hat h(x)) = 0$.
Exact restoration kernel:  $h^* = \arg\min_h E\,\ell(x; h) = c\,w^{-1}$, so that $E\,\nabla\ell(x; h^*) = 0$.

The scaling factor $c$ has to obey  $E\,\varphi'(cs)\,cs = 1$.
Estimation error:  $\Delta h = \hat h(x) - h^*$.

From a second-order Taylor expansion,
$\nabla\ell(x; h^*) + \nabla^2\ell(x; h^*)\,\Delta h \approx 0;$
by equivariance, the same relation holds with $(x, h^*)$ replaced by $(cs, \delta)$, whence
$\Delta h \approx -\big[\nabla^2\ell(cs; \delta)\big]^{-1}\,\nabla\ell(cs; \delta).$
Asymptotically ($T \to \infty$), the system $\nabla^2\ell\,\Delta h = -g$ has a separable structure: it decouples into independent $2\times 2$ systems in the pairs $(\Delta h_k, \Delta h_{-k})$, $k = 1, 2, \dots, N$, together with a scalar equation for $\Delta h_0$ with coefficient $1 + E\,\varphi''(cs)\,(cs)^2$.
The estimation error covariance matrix
$\mathrm{Cov}\,\Delta h = H^{-1}\,\mathrm{Cov}(g)\,H^{-1}$
asymptotically separates accordingly: the second moments $E\,\Delta h_k \Delta h_k$ and $E\,\Delta h_k \Delta h_{-k}$, $k = 1, 2, \dots, N$, and $E\,\Delta h_0^2$ are obtained from the corresponding $2\times 2$ blocks and the gradient covariances $E\,g_k g_k$, $E\,g_k g_{-k}$, and $E\,g_0^2$.
Asymptotic gradient covariance matrices: the $2\times 2$ covariances of the pairs $(g_k, g_{-k})$ and the variance of $g_0$ are expressed through the moments
$\alpha = E\,\big(\varphi'(cs)\big)^2 \quad\text{and}\quad \sigma^2 = E\,(cs)^2.$
Asymptotic estimation error covariance: substituting the gradient covariances yields closed-form asymptotic expressions for the error covariances of the pairs $(\Delta h_k, \Delta h_{-k})$ and of $\Delta h_0$, in terms of the second moments of $\varphi'(cs)$, $\varphi''(cs)$, and $cs$; all decay as $1/T$.

Asymptotic signal-to-interference ratio (SIR) estimate: the SIR grows proportionally to $T/N$, with a constant depending only on the distribution-dependent moments above.
CRAMÉR-RAO LOWER BOUNDS

True ML estimator:  $\varphi(s) = -\log f(s)$.

The distribution-dependent parameters simplify, and the asymptotic error covariance reduces to an expression governed by the two scalars
$L = E\,s^2 \cdot E\,\big((\log f(s))'\big)^2$
and
$L_S = \mathrm{Cum}\big((\log f(s))',\, (\log f(s))',\, s,\, s\big) + 1.$
The asymptotic SIR estimate simplifies accordingly: it is proportional to $T/N$, with a constant depending only on $L$ and $L_S$.
SUPEREFFICIENCY

Let the source be sparse, i.e., $P(s = 0) > 0$, and let $\varphi_\lambda(s)$ be the smoothed absolute value with smoothing parameter $\lambda$.

Then the normalized asymptotic error variance obeys
$\operatorname*{p\,lim}_{T\to\infty}\; T\,\mathrm{Var}\,\Delta h_k \;\le\; \mathrm{const}(\lambda),$
with $\mathrm{const}(\lambda) \to 0$ as $\lambda \to 0$, hence
$\lim_{\lambda\to 0}\;\operatorname*{p\,lim}_{T\to\infty}\; T\,\mathrm{Var}\,\Delta h_k = 0:$
the estimator is superefficient; its variance decays faster than $1/T$.
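One common smoothed absolute value, assumed here, is $\varphi_\lambda(s) = |s| - \lambda\log(1 + |s|/\lambda)$; the short sketch below checks that it is finite, vanishes at the origin, and converges to $|s|$ as $\lambda \to 0^+$:

```python
import numpy as np

def phi(s, lam):
    """Smoothed absolute value (assumed form): |s| - lam * log(1 + |s|/lam).
    Smooth for lam > 0; converges pointwise to |s| as lam -> 0+."""
    a = np.abs(s)
    return a - lam * np.log1p(a / lam)

s = np.linspace(-2, 2, 401)
for lam in (1.0, 0.1, 0.01):
    print(lam, np.max(np.abs(phi(s, lam) - np.abs(s))))   # shrinking gap
```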
Similar results are obtained for a uniformly distributed source with a suitable choice of $\varphi(s)$. The analysis can be extended to sources whose PDF vanishes outside some interval.
ASYMPTOTIC STABILITY

The QML estimator $\hat h(x)$ is said to be asymptotically stable if $h^*$ is a local minimizer of $\ell(x; h)$ in the limit $T \to \infty$.

Theorem: The QML estimator is asymptotically stable if the following conditions hold:
(1) $\beta > 0$;  (2) $\beta\,\sigma^2 > 1$;  (3) $1 + \gamma > 0$,
where $\beta = E\,\varphi''(cs)$, $\sigma^2 = E\,(cs)^2$, $\gamma = E\,\varphi''(cs)\,(cs)^2$;
and it is asymptotically unstable if one of the following conditions holds:
(1') $\beta < 0$;  (2') $\beta\,\sigma^2 < 1$;  (3') $1 + \gamma < 0$.
EXAMPLE

Generalized Laplace distribution:
$f(s) = \frac{a}{2\,b\,\Gamma(1/a)}\; e^{-(|s|/b)^a}.$

$a = 1$ gives the Laplacian and $a = 2$ the Gaussian distribution; smaller $a$ gives super-Gaussian, larger $a$ sub-Gaussian behavior.

[Plot: f(s) for shape parameters a = 0.5, 1, 2, 5.]
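With the normalization written above (an assumed but standard form of the generalized Gaussian family), the density integrates to one for any shape parameter; a small numerical check, with the helper name `gen_laplace_pdf` illustrative:

```python
import numpy as np
from math import gamma

def gen_laplace_pdf(s, a, b=1.0):
    """Generalized Laplace (generalized Gaussian) density, assumed form:
       f(s) = a / (2 b Gamma(1/a)) * exp(-(|s|/b)**a).
    a = 1: Laplacian; a = 2: Gaussian (up to the variance convention)."""
    return a / (2 * b * gamma(1 / a)) * np.exp(-(np.abs(s) / b) ** a)

# The density integrates to ~1 for every shape parameter a:
s = np.linspace(-200, 200, 400_001)
ds = s[1] - s[0]
for a in (0.5, 1.0, 2.0, 5.0):
    print(a, gen_laplace_pdf(s, a).sum() * ds)
```

The wide integration range is needed because the $a = 0.5$ tail, $e^{-\sqrt{|s|}}$, decays very slowly.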
STABILITY OF THE SUPER-GAUSSIAN ESTIMATOR

[Plot: asymptotic stability of the super-Gaussian estimator $\varphi_\lambda(s) = |s| - \lambda\log(1 + |s|/\lambda)$ vs. the generalized-Laplace shape parameter $a$, compared to the true ML choice $\varphi(s) = -\log f(s)$; curves for smoothing parameters $10^{-2}$, $10^{-4}$, and $0$.]
STABILITY OF THE SUB-GAUSSIAN ESTIMATOR

[Plot: asymptotic stability of the sub-Gaussian estimator vs. the generalized-Laplace shape parameter $a$, compared to the true ML choice $\varphi(s) = -\log f(s)$; curves for parameter values 3, 5, and 10.]
PERFORMANCE OF THE SUPER-GAUSSIAN ESTIMATOR

[Plot: asymptotic error variance $\mathrm{Var}\,\Delta h \propto 1/T$ of the super-Gaussian estimator $\varphi_\lambda(s) = |s| - \lambda\log(1 + |s|/\lambda)$ vs. the shape parameter $a$, compared to the true ML choice $\varphi(s) = -\log f(s)$; curves for smoothing parameters $10^{-2}$, $10^{-4}$, and $0$.]
PERFORMANCE OF THE SUB-GAUSSIAN ESTIMATOR

[Plot: asymptotic error variance $\mathrm{Var}\,\Delta h \propto 1/T$ of the sub-Gaussian estimator vs. the shape parameter $a$, compared to the true ML choice $\varphi(s) = -\log f(s)$; curves for parameter values 3, 5, and 10.]
AGENDA

- Introduction
- QML blind deconvolution
- Asymptotic analysis
- Relative Newton
  - Relative optimization
  - Relative Newton
  - Fast relative Newton
- Generalizations
RELATIVE OPTIMIZATION (RO)

0. Start with an initial guess $h^{(0)}$ and $x^{(0)} = h^{(0)} * x$.
1. For $k = 0, 1, 2, \dots$ until convergence:
2.   Start with $h^{(k+1)} = \delta$ (the unit impulse).
3.   Find $h^{(k+1)}$ such that $\ell(x^{(k)}; h^{(k+1)}) < \ell(x^{(k)}; \delta)$.
4.   Update the source estimate: $x^{(k+1)} = h^{(k+1)} * x^{(k)}$.
5. End for.

Restoration kernel estimate:  $\hat h(x) = h^{(0)} * h^{(1)} * \cdots * h^{(K)}$.
Source estimate:  $\hat s = x^{(K)}$.
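The RO scheme can be sketched end to end. This minimal version substitutes a backtracking gradient step for the Newton inner step (an assumption made to keep the sketch short), and tracks $\ell(x;\, h^{(0)} * \cdots * h^{(k)})$ to illustrate the monotonicity of the target function values; all helper names are illustrative:

```python
import numpy as np

def qml_obj(x, h, phi, nfft=2048):
    """QML objective, returned together with its log-spectrum term."""
    T = len(x)
    logterm = -T * np.mean(np.log(np.abs(np.fft.fft(h, nfft))))
    return logterm + np.sum(phi(np.convolve(x, h)[:T])), logterm

def qml_grad(x, h, dphi, nfft=2048):
    """Gradient: -T * (J h^{-1})_k + sum_n phi'(y_n) x_{n-k} on an FFT grid."""
    T, K = len(x), len(h)
    h_inv = np.fft.ifft(1.0 / np.fft.fft(h, nfft))
    mirror = np.real(h_inv[(-np.arange(K)) % nfft])
    y = np.convolve(x, h)[:T]
    data = np.array([np.dot(dphi(y[k:]), x[:T - k]) for k in range(K)])
    return -T * mirror + data

def relative_optimization(x, K=7, iters=40):
    """Each pass optimizes a short kernel around the unit impulse and folds it
    into the running source estimate; phi = log cosh (super-Gaussian choice)."""
    phi, dphi = (lambda s: np.log(np.cosh(s))), np.tanh
    delta = np.zeros(K); delta[0] = 1.0
    x_cur, h_hat = x.astype(float), delta.copy()
    vals, total_log = [], 0.0
    for _ in range(iters):
        g = qml_grad(x_cur, delta, dphi)
        f0, _ = qml_obj(x_cur, delta, phi)
        h, log_t, t = delta, 0.0, 1e-3
        while t > 1e-12:                        # backtrack until l decreases
            f_c, lt_c = qml_obj(x_cur, delta - t * g, phi)
            if f_c < f0:
                h, log_t = delta - t * g, lt_c
                break
            t /= 2
        x_cur = np.convolve(x_cur, h)[:len(x)]  # x^(k+1) = h^(k+1) * x^(k)
        h_hat = np.convolve(h_hat, h)[:K]       # truncated product of the h^(k)
        total_log += log_t
        vals.append(total_log + np.sum(phi(x_cur)))  # = l(x; h^(0)*...*h^(k))
    return x_cur, h_hat, vals

rng = np.random.default_rng(3)
s = rng.laplace(size=2000)
x = np.convolve(s, np.array([1.0, 0.6]))[:2000]   # observation through w = [1, 0.6]
x_hat, h_hat, vals = relative_optimization(x)
print(vals[0], vals[-1])                          # target values decrease
```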
Observation: The $k$-th step of the relative optimization algorithm depends only on $w * h^{(0)} * \cdots * h^{(k-1)}$.

Proposition: The sequence of target function values produced by the relative optimization algorithm is monotonically decreasing, i.e.,
$\ell\big(x;\; h^{(0)} * h^{(1)} * \cdots * h^{(k)}\big) \;<\; \ell\big(x;\; h^{(0)} * h^{(1)} * \cdots * h^{(k-1)}\big).$
RELATIVE NEWTON

Relative Newton = use one Newton step in the RO algorithm.

Near the solution point, $x^{(k)} \approx c\,s$, and the Hessian of $\ell(x^{(k)};\,\cdot\,)$ at $\delta$ approaches the separable asymptotic Hessian. The Newton system $H d = -g$ therefore separates into independent $2\times 2$ systems,
$\begin{pmatrix} H_{k,k} & H_{k,-k} \\ H_{-k,k} & H_{-k,-k} \end{pmatrix} \begin{pmatrix} d_k \\ d_{-k} \end{pmatrix} = -\begin{pmatrix} g_k \\ g_{-k} \end{pmatrix},$
together with the scalar equation $H_{0,0}\, d_0 = -g_0$.
FAST RELATIVE NEWTON

Fast relative Newton = use one Newton step with an approximate Hessian in the RO algorithm, together with a regularized solution of the approximate Newton system.

Evaluating the approximate Hessian has the same order of complexity as evaluating the gradient.
AGENDA

- Introduction
- QML blind deconvolution
- Asymptotic analysis
- Relative Newton
- Generalizations
GENERALIZATIONS

- IIR KERNELS: a rational (pole-zero) restoration kernel,
  $H(z) = \frac{B(z)}{C(z)\,A(z)}$, with $B(z) = b_1 z^{-1} + \cdots + b_N z^{-N}$, $C(z) = 1 + c_1 z^{-1} + \cdots + c_M z^{-M}$, $A(z) = 1 + a_1 z^{-1} + \cdots + a_L z^{-L}$.
- BLOCK PROCESSING, ONLINE DECONVOLUTION
- MULTI-CHANNEL DECONVOLUTION, joint blind source separation and blind deconvolution (BSS+BD)
- DECONVOLUTION OF IMAGES, with the use of sparse representations