
  • GDR ISIS – Modélisation mathématique des textures, Télécom ParisTech, France, December 05, 2012

    Patch matching under non Gaussian noise

    Charles Deledalle1, Florence Tupin2, Loïc Denis3, Martin Royer2

    1 CNRS, Institut de Mathématiques de Bordeaux, Université Bordeaux 1, France

    2 Institut Telecom, Telecom ParisTech, CNRS LTCI, Paris, France

    3 Telecom Saint-Etienne, France

    December 05, 2012

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 1 / 24

  • Motivation

    Increasing use of patches to model images

    Texture synthesis,

    Inpainting,

    Image editing,

    Denoising,

    Super-resolution,

    Image registration,

    Stereo vision,

    Object tracking.

    Image model based on the natural redundancy of patches

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 2 / 24

  • Motivation

    Increasing use of patches to model images

    Texture synthesis,

    Inpainting,

    Image editing,

    Denoising,

    Super-resolution,

    Image registration,

    Stereo vision,

    Object tracking.

    (a) Mitochondrion in microscopy, (b) Supernova in X-ray imagery (© Chandra), (c) Polarimetric SAR imagery (© DLR)

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 2 / 24

  • Motivation

    Increasing use of patches to model images

    Texture synthesis,

    Inpainting,

    Image editing,

    Denoising,

    Super-resolution,

    Image registration,

    Stereo vision,

    Object tracking.

    How to compare noisy patches together?

    How to take into account the noise model?

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 2 / 24

  • Motivation

    Increasing use of patches to model images

    Texture synthesis,

    Inpainting,

    Image editing,

    Denoising,

    Super-resolution,

    Image registration,

    Stereo vision,

    Object tracking.

    How to compare noisy patches to noise-free ones?

    How to take into account the noise model?

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 2 / 24

  • Table of contents

    1 A similarity criterion to compare noisy patches

    2 Compare noisy patches to noise-free ones with contrast invariance

    3 Conclusions and perspectives

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 3 / 24

  • Table of contents

    1 A similarity criterion to compare noisy patches

    2 Compare noisy patches to noise-free ones with contrast invariance

    3 Conclusions and perspectives

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 4 / 24

  • Motivation

    (a) Microscopy, (b) Astronomy (© Chandra), (c) SAR polarimetry (© DLR)

    How to compare noisy patches?

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 4 / 24

  • Patch similarity from squared differences

    How to compare noisy patches?

    Assume noise is additive and Gaussian such that:

    v1 = u1 + n1   and   v2 = u2 + n2

    Compare patches using squared differences:

    when u1 = u2 : (v1 − v2)² is low ⇒ decide “similar”

    when u1 ≠ u2 : (v1 − v2)² is high ⇒ decide “dissimilar”

    What about non-Gaussian noise?

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 5 / 24
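    Written out, the decision rule above amounts to thresholding the sum of squared differences between the two noisy patches. A minimal sketch of this (not part of the slides; the patch size, intensity range and noise level are arbitrary choices):

```python
import numpy as np

# Toy sketch: squared-difference patch comparison under additive Gaussian noise.
rng = np.random.default_rng(0)
sigma = 20.0

u = rng.uniform(50.0, 200.0, size=(8, 8))           # hypothetical noise-free patch
v1 = u + sigma * rng.normal(size=u.shape)           # two noisy observations of the same patch
v2 = u + sigma * rng.normal(size=u.shape)
v3 = rng.uniform(50.0, 200.0, size=(8, 8)) + sigma * rng.normal(size=(8, 8))  # unrelated patch

def sq_diff(a, b):
    """Sum of squared differences between two patches."""
    return float(np.sum((a - b) ** 2))

print(sq_diff(v1, v2))   # statistically low  -> decide "similar"
print(sq_diff(v1, v3))   # statistically high -> decide "dissimilar"
```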

  • Limits of squared differences

    Beyond the Gaussian noise assumption

    Noise can be non-additive and/or non-Gaussian, e.g., for Poisson noise:

    v1 = u1 + n1   and   v2 = u2 + n2   (with signal-dependent noise n1, n2)

    The squared differences are no longer discriminant:

    when u1 = u2 and when u1 ≠ u2, the values of (v1 − v2)² are of comparable magnitude

    The squared differences are not discriminant when noise departs from Gaussian

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 6 / 24

  • Similarity through variance stabilization

    Variance stabilization approach

    Use a mapping s which stabilizes the variance for a specific noise model

    Evaluate the squared differences between the transformed patches:

    ( s(v1) − s(v2) )²

    Example (a code sketch follows this slide)

    Gamma noise (multiplicative) and the homomorphic approach: s(V) = log V

    Poisson noise and the Anscombe transform: s(V) = 2 √(V + 3/8)

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 7 / 24
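    A minimal sketch of this recipe, assuming Poisson-distributed patches and the two transforms quoted above (the Anscombe transform and the log/homomorphic transform); sizes and intensities are arbitrary:

```python
import numpy as np

def anscombe(v):
    """Anscombe transform: approximately stabilizes the variance of Poisson noise."""
    return 2.0 * np.sqrt(v + 3.0 / 8.0)

def homomorphic(v):
    """Log transform: stabilizes the variance of multiplicative (gamma) noise."""
    return np.log(v)

def stabilized_sq_diff(v1, v2, s):
    """Squared differences computed between the transformed patches s(v1) and s(v2)."""
    return float(np.sum((s(v1) - s(v2)) ** 2))

# Toy usage with Poisson noise.
rng = np.random.default_rng(0)
u = rng.uniform(5.0, 30.0, size=(8, 8))
v1, v2 = rng.poisson(u), rng.poisson(u)
print(stabilized_sq_diff(v1, v2, anscombe))
```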

  • Similarity with variance stabilization

    Limits

    Only heuristic

    No optimality results

    Does not take into account the statistics of the transformed data

    Does not apply to all noise distributions, e.g., multi-modal distributions such as the interferometric phase distribution

    (a) Image with impulse noise, (b) SAR interferometric phase (© ONERA, © CNES)

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 8 / 24

  • Similarity in a detection framework

    Similarity in the light of detection theory

    Similarity can be defined as an hypothesis test (i.e., a parameter test):

    H0 : u1 = u2 ≡ u12 (null hypothesis)
    H1 : u1 ≠ u2 (alternative hypothesis)

    Its performance can be measured as:

    PFA = P(decide “dissimilar” | u12, H0) (false-alarm rate)
    PD = P(decide “dissimilar” | u1, u2, H1) (detection rate)

    The likelihood ratio (LR) test maximizes PD for any PFA:

    L(v1, v2) = p(v1, v2 | u12, H0) / p(v1, v2 | u1, u2, H1)   ← given by the noise distribution model

    → Problem: u12, u1 and u2 are unknown

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 9 / 24

  • Similarity in a detection framework

    Generalized likelihood ratio (GLR)

    Replace u12, u1 and u2 with maximum likelihood estimates (MLE)

    Define the (negative log) generalized likelihood ratio test:

    − log LG(v1, v2) = − log [ sup_t p(v1, v2 | u12 = t, H0) / sup_{t1, t2} p(v1, v2 | u1 = t1, u2 = t2, H1) ]

                     = − log [ p(v1 | u1 = t̂12) p(v2 | u2 = t̂12) / ( p(v1 | u1 = t̂1) p(v2 | u2 = t̂2) ) ]

    Illustration

    [Plots: possibility measure of u1 = u2 as a function of t over the noise-free values space (domain of t)]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 10 / 24

  • Similarity in a detection framework

    Equal self similarity

    Assume v1 = v2, then:

    − log LG(v1, v2) = 0

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 10 / 24

  • Similarity in a detection framework

    Maximal self similarity

    Assume v1 ≠ v2, then:

    − log LG(v1, v2) > 0

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 10 / 24

  • Similarity in a detection framework

    Other similarity criteria have been proposed:

    Bayesian joint likelihood: ∫ p(v1 | u1 = t) p(v2 | u2 = t) p(u12 = t) dt
    [Deledalle et al., 2009] [Yianilos, 1995, Matsushita and Lin, 2007]

    Maximum joint likelihood: sup_t p(v1 | u1 = t) p(v2 | u2 = t)
    [Alter et al., 2006]

    Bayesian likelihood ratio: ∫ p(v1 | u1 = t) p(v2 | u2 = t) p(u12 = t) dt / [ ∫ p(v1 | u1 = t) p(u1 = t) dt · ∫ p(v2 | u2 = t) p(u2 = t) dt ]
    [Minka, 1998, Minka, 2000]

    Mutual information kernel: ∫ p(v1 | u1 = t) p(v2 | u2 = t) p(u12 = t) dt / √( ∫ p(v1 | u1 = t)² p(u1 = t) dt · ∫ p(v2 | u2 = t)² p(u2 = t) dt )
    [Seeger, 2002]

    GLR: sup_t p(v1 | u1 = t) p(v2 | u2 = t) / [ sup_t p(v1 | u1 = t) · sup_t p(v2 | u2 = t) ]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 11 / 24

  • Patch similarity criteria – Generalized likelihood ratio

    name     | pdf p(v | u)                          | − log LG (up to constants)                            | sq. diff. after stabilization
    Gaussian | exp(−(v − u)² / (2σ²)) / (σ√(2π))     | (v1 − v2)²                                            | (v1 − v2)²
    Poisson  | u^v e^(−u) / v!                       | log( 2^(v1+v2) v1^(v1) v2^(v2) / (v1 + v2)^(v1+v2) )  | ( √(v1 + 3/8) − √(v2 + 3/8) )²
    Gamma    | L^L v^(L−1) e^(−Lv/u) / (Γ(L) u^L)    | log( √(v1/v2) + √(v2/v1) ) − log 2                    | ( log(v1/v2) )²

    The three criteria for three noise models

    Illustration with Gamma noise

    1. When u1 = u2 = u12, the residue is statistically small: − log LG(v1, v2) is low ⇒ decide “similar”

    2. Then, when u1 ≠ u2, the residue is statistically higher: − log LG(v1, v2) is high ⇒ decide “dissimilar”

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 12 / 24
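    The three − log LG expressions of the table can be turned into patch dissimilarities directly. A sketch (constants and scale factors are dropped, as in the table; the toy usage at the end is an arbitrary choice):

```python
import numpy as np
from scipy.special import xlogy   # xlogy(0, 0) = 0, convenient for zero photon counts

def glr_gaussian(v1, v2):
    """-log L_G under Gaussian noise, up to a 1/(4 sigma^2) factor: sum of (v1 - v2)^2."""
    return float(np.sum((v1 - v2) ** 2))

def glr_poisson(v1, v2):
    """-log L_G under Poisson noise: sum of log(2^(v1+v2) v1^v1 v2^v2 / (v1+v2)^(v1+v2))."""
    s = v1 + v2
    return float(np.sum(s * np.log(2.0) + xlogy(v1, v1) + xlogy(v2, v2) - xlogy(s, s)))

def glr_gamma(v1, v2):
    """-log L_G under gamma noise, up to a positive factor depending on the number of looks L:
    sum of log(sqrt(v1/v2) + sqrt(v2/v1)) - log 2 (requires v1, v2 > 0)."""
    return float(np.sum(np.log(np.sqrt(v1 / v2) + np.sqrt(v2 / v1)) - np.log(2.0)))

# Toy usage with two Poisson observations of the same underlying patch.
rng = np.random.default_rng(0)
u = rng.uniform(1.0, 20.0, size=(8, 8))
print(glr_poisson(rng.poisson(u).astype(float), rng.poisson(u).astype(float)))
```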

  • Evaluation of similarity criteria – Detection performance

    [ROC curves under Gamma noise: probability of detection vs. probability of false alarm for the generalized likelihood ratio, variance stabilization, squared differences, maximum joint likelihood [Alter et al., 2006], mutual information kernel [Seeger, 2002], Bayesian likelihood ratio [Minka, 1998, Minka, 2000] and Bayesian joint likelihood [Yianilos, 1995, Matsushita and Lin, 2007]]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 13 / 24

  • Evaluation of similarity criteria – Detection performance

    [ROC curves under Poisson noise: probability of detection vs. probability of false alarm for the generalized likelihood ratio, variance stabilization, squared differences, maximum joint likelihood [Alter et al., 2006], mutual information kernel [Seeger, 2002], Bayesian likelihood ratio [Minka, 1998, Minka, 2000] and Bayesian joint likelihood [Yianilos, 1995, Matsushita and Lin, 2007]]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 13 / 24
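    Curves of this kind can be reproduced in spirit by Monte Carlo simulation: draw many pairs (v1, v2) under H0 and under H1, compute a dissimilarity, and sweep the decision threshold. A sketch under simple assumptions not taken from the slides (scalar "patches", 3-look gamma noise, and an H1 where the two underlying values differ by a factor of 2):

```python
import numpy as np

def roc_curve(score_h0, score_h1, n_thresh=200):
    """PFA and PD as the threshold ('dissimilar' if score > threshold) is swept."""
    thresholds = np.quantile(np.concatenate([score_h0, score_h1]),
                             np.linspace(0.0, 1.0, n_thresh))
    pfa = np.array([(score_h0 > t).mean() for t in thresholds])  # false alarms under H0
    pd_ = np.array([(score_h1 > t).mean() for t in thresholds])  # detections under H1
    return pfa, pd_

def glr_gamma(v1, v2):
    """-log GLR for gamma noise, up to a positive factor."""
    return np.log(np.sqrt(v1 / v2) + np.sqrt(v2 / v1)) - np.log(2.0)

rng = np.random.default_rng(0)
n, looks = 20_000, 3
u = rng.uniform(1.0, 10.0, size=n)
v1_h0, v2_h0 = rng.gamma(looks, u / looks), rng.gamma(looks, u / looks)       # same u
v1_h1, v2_h1 = rng.gamma(looks, u / looks), rng.gamma(looks, 2 * u / looks)   # u vs 2u

pfa, pd_ = roc_curve(glr_gamma(v1_h0, v2_h0), glr_gamma(v1_h1, v2_h1))
```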

  • Evaluation on denoising – Non-local filtering with GLR

    [Results under Gamma noise (© ONERA, © CNES) and Poisson noise (© Chandra): (a) noisy image, (b) non-local filtering with the Euclidean distance, (c) with GLR]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 14 / 24
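    The results above plug a noise-adapted patch dissimilarity into non-local weighted averaging. A rough sketch of that idea (not the authors' exact filter; window sizes, the exp(−D/h) weight kernel and the absence of border handling are arbitrary simplifications):

```python
import numpy as np
from scipy.special import xlogy

def glr_poisson(p1, p2):
    """Patch dissimilarity: -log GLR for Poisson noise, summed over pixels."""
    s = p1 + p2
    return float(np.sum(s * np.log(2.0) + xlogy(p1, p1) + xlogy(p2, p2) - xlogy(s, s)))

def nl_estimate(img, r, c, half_patch=1, half_search=3, h=2.0, dissim=glr_poisson):
    """Non-local estimate at pixel (r, c): average over the search window,
    weighted by exp(-dissimilarity / h) between patches (no border handling)."""
    ref = img[r - half_patch:r + half_patch + 1, c - half_patch:c + half_patch + 1]
    num, den = 0.0, 0.0
    for i in range(r - half_search, r + half_search + 1):
        for j in range(c - half_search, c + half_search + 1):
            cand = img[i - half_patch:i + half_patch + 1, j - half_patch:j + half_patch + 1]
            w = np.exp(-dissim(ref, cand) / h)
            num += w * img[i, j]
            den += w
    return num / den

rng = np.random.default_rng(0)
noisy = rng.poisson(10.0, size=(32, 32)).astype(float)
print(nl_estimate(noisy, 16, 16))
```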

  • Stereo-vision

    [Disparity maps: (a) noisy image, (d) ground truth; under Gamma noise, (b) Euclidean distance vs. (c) GLR; under Poisson noise, (e) Euclidean distance vs. (f) GLR]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 15 / 24

  • Motion tracking – Glacier monitoring with a stereo pair of SAR images

    (g) Noisy image (h) Euclidean distance (i) GLR

    Glacier of Argentière. With GLR, the estimated speeds match the ground truth: an average of 12.27 cm/day over the surface and a maximum of 41.12 cm/day in the areas with crevasses.

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 16 / 24

  • Comparison of noisy patches beyond Gaussian noise

    Conclusion

    Similarity between noisy patches expressed as an hypothesis test

    Among 7 similarity criteria, GLR provides the best performance

    Applies even when variance stabilization is not possible

    Easy to derive as long as the MLE is known in closed form

    Offers good theoretical properties:

    criterion               | Max. self sim. | Eq. self sim. | Id. of indiscernible | Invariance | Asym. CFAR | Asym. UMPI
    Euclidean kernel        | √              | √             | √                    | ×          | ×          | ×
    Stabilization transform |                |               |                      |            |            |
    Bayesian joint lik.     | ×              | ×             | ×                    | ×          | ×          | ×
    Maximum joint lik.      | ×              | ×             | ×                    | ×          | ×          | ×
    Bayesian lik. ratio     | ×              | ×             | ×                    | √          | ×          | ×
    Mutual info. kernel     | √              | √             | √                    | √          | ×          | ×
    GLR                     | √              | √             | √                    | √          | √          | √

    [Deledalle et al., 2012] Deledalle, C., Denis, L., Tupin, F. (2012). How to compare noisy patches? Patch similarity beyond Gaussian noise. International Journal of Computer Vision, vol. 99, no. 1, pp. 86–102.

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 17 / 24

  • Table of contents

    1 A similarity criterion to compare noisy patches

    2 Compare noisy patches to noise-free ones with contrast invariance

    3 Conclusions and perspectives

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 18 / 24

  • Motivation

    (a) Microscopy, (b) Dictionary

    How to compare noisy patches to noise-free ones?

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 18 / 24

  • Compare noisy patches to noise-free ones with contrast invariance

    v1 = u1 + n1, where the noisy patch v1 matches one of the noise-free patches u2 of the dictionary

    First intuition: use the correlation

    Compute the modulus of the correlation

    C(v1, u2) = | Σ_k (v1,k − v̄1)(u2,k − ū2) | / √( Σ_k (v1,k − v̄1)² · Σ_k (u2,k − ū2)² )

    such that

    u1 = αu2 + β : C(v1, u2) = 0.97 ⇒ decide “similar”

    u1 ≠ αu2 + β : C(v1, u2) = 0.07 ⇒ decide “dissimilar”

    Is the correlation a good detector?

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 19 / 24
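    A direct transcription of the criterion above (a sketch; the example patches are synthetic):

```python
import numpy as np

def correlation_modulus(v1, u2):
    """|C(v1, u2)|: modulus of the empirical correlation between a noisy patch v1
    and a noise-free patch u2, invariant to affine changes of contrast."""
    a = v1 - v1.mean()
    b = u2 - u2.mean()
    return float(np.abs(np.sum(a * b)) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)))

# Toy check: v1 is an affine contrast change of u2 plus Gaussian noise.
rng = np.random.default_rng(0)
u2 = rng.uniform(0.0, 1.0, size=(8, 8))
v1 = 3.0 * u2 + 10.0 + 0.1 * rng.normal(size=u2.shape)
print(correlation_modulus(v1, u2))                                    # close to 1 -> "similar"
print(correlation_modulus(v1, rng.uniform(0.0, 1.0, size=(8, 8))))    # near 0 -> "dissimilar"
```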

  • Similarity in a detection framework

    Detection theory

    Similarity can be defined as an hypothesis test (i.e., a parameter test):

    H0 : u1 = αu2 + β (null hypothesis)
    H1 : u1 ≠ αu2 + β (alternative hypothesis)

    The likelihood ratio (LR) test maximizes PD for any PFA:

    L(v1, u2) = p(v1 | αu2 + β, H0) / p(v1 | u1, H1)   ← given by the noise distribution model

    → Problem: α, β and u1 are unknown

    Generalized likelihood ratio (GLR)

    Replace α, β and u1 with maximum likelihood estimates (MLE)

    Define the (negative log) generalized likelihood ratio test:

    − log LG(v1, u2) = − log [ sup_{α, β} p(v1 | u1 = αu2 + β, H0) / sup_{t1} p(v1 | u1 = t1, H1) ]

                     = − log [ p(v1 | u1 = α̂u2 + β̂) / p(v1 | u1 = t̂1) ]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 20 / 24

  • Similarity in a detection framework

    GLR for Gaussian noise

    Obtain α̂ and β̂ by minimizing − log p(v1 | u1 = α̂u2 + β̂) ∝ ‖v1 − α̂u2 − β̂‖²

    ⇒ α̂ = Σ_k (v1,k − v̄1)(u2,k − ū2) / Σ_k (u2,k − ū2)²   and   β̂ = v̄1 − α̂ū2

    Injecting α̂ and β̂ gives

    − log LG(v1, u2) = [ 1 − C²(v1, u2) ] · Σ_k (v1,k − v̄1)² / (2σ²)

    For a fixed v1, maximizing GLR is equivalent to maximizing the correlation or the likelihood

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 21 / 24
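    A sketch of the closed-form expression above (σ is assumed known; the toy usage is arbitrary):

```python
import numpy as np

def neg_log_glr_gaussian(v1, u2, sigma):
    """-log L_G(v1, u2) = (1 - C^2) * sum_k (v1_k - mean(v1))^2 / (2 sigma^2),
    with C the correlation between the centered patches (least-squares fit of alpha, beta)."""
    a = v1 - v1.mean()
    b = u2 - u2.mean()
    c2 = np.sum(a * b) ** 2 / (np.sum(a ** 2) * np.sum(b ** 2))
    return float((1.0 - c2) * np.sum(a ** 2) / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
u2 = rng.uniform(0.0, 1.0, size=(8, 8))
v1 = 2.0 * u2 + 5.0 + 0.05 * rng.normal(size=u2.shape)
print(neg_log_glr_gaussian(v1, u2, sigma=0.05))   # small -> decide "similar"
```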

  • Similarity in a detection framework

    Illustration

    The correlation does not take into account the noise, while GLR does

    − log C(v1, u2) can be higher for the true match than for a wrong candidate, while − log LG(v1, u2) is much lower for the true match.

    In fact, a constant patch 0 × u2 + β can “explain” the noisy patch better than the candidate preferred by the correlation, which GLR accounts for.

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 21 / 24

  • Similarity in a detection framework

    GLR for non-Gaussian noise

    For gamma and Poisson distributions: no closed-form expressions

    Redefine contrast invariance, for instance, as u1 = β u2^α

    Estimate α̂ and β̂ iteratively by Newton's method (generally, only a few iterations are required; see the sketch after this slide)

    Evaluate the negative log-likelihood ratio at α̂ and β̂

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 21 / 24
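    A sketch of the non-Gaussian case for Poisson noise with the contrast model u1 = β u2^α. For brevity it uses a generic quasi-Newton solver (scipy's BFGS) rather than a hand-written Newton iteration; the optimizer choice and the small floors on the intensities are implementation conveniences, not part of the slides:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import xlogy, gammaln

def poisson_nll(v, u):
    """Negative Poisson log-likelihood of counts v for intensities u."""
    return float(np.sum(u - xlogy(v, u) + gammaln(v + 1.0)))

def neg_log_glr_poisson(v1, u2):
    """-log L_G(v1, u2): best fit under H0 (u1 = beta * u2**alpha) minus best fit under H1 (u1 = v1)."""
    u2 = np.maximum(u2, 1e-8)

    def h0_cost(params):                      # parameterize beta = exp(log_beta) > 0
        log_beta, alpha = params
        return poisson_nll(v1, np.exp(log_beta) * u2 ** alpha)

    res = minimize(h0_cost, x0=np.array([0.0, 1.0]), method="BFGS")
    nll_h1 = poisson_nll(v1, np.maximum(v1, 1e-8))   # MLE under H1 is u1 = v1
    return res.fun - nll_h1

rng = np.random.default_rng(0)
u2 = rng.uniform(2.0, 20.0, size=(8, 8))
v1 = rng.poisson(3.0 * u2 ** 0.8).astype(float)      # matches u2 up to a change of contrast
print(neg_log_glr_poisson(v1, u2))
```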

  • Evaluation of similarity criteria – Detection performance

    [ROC curves under Gaussian noise: probability of detection vs. probability of false alarm for the generalized likelihood ratio, the correlation (with stabilization), the Gaussian GLR with stabilization, and the Gaussian GLR]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 22 / 24

  • Evaluation of similarity criteria – Detection performance

    [ROC curves under Gamma noise: probability of detection vs. probability of false alarm for the generalized likelihood ratio, the correlation with stabilization, the Gaussian GLR with stabilization, and the Gaussian GLR]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 22 / 24

  • Evaluation of similarity criteria – Detection performance

    [ROC curves under Poisson noise: probability of detection vs. probability of false alarm for the generalized likelihood ratio, the correlation with stabilization, the Gaussian GLR with stabilization, and the Gaussian GLR]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 22 / 24

  • Evaluation on denoising – Patch dictionary based denoising

    [Results under Gamma noise (© ONERA, © CNES): (a) noisy image, (b) result, (c) dictionary]

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 23 / 24

  • Table of contents

    1 A similarity criterion to compare noisy patches

    2 Compare noisy patches to noise-free ones with contrast invariance

    3 Conclusions and perspectives

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 24 / 24

  • Conclusions and perspectives

    Take home messages

    Comparison of patches can be expressed as an hypothesis test

    Robust noisy patch comparison

    Robust noisy versus noise-free patch comparison with contrast invariance

    Perform slightly better than variance stabilization

    Apply even when variance stabilization is not possible

    Can be computed in closed form in many cases (or in a few iterations)

    Perspectives

    Inject geometric invariance

    Inject priors

    Deal with blur, subsampling, masking, . . .

    Patch dictionary learning based patch matching

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 24 / 24

  • Thanks for your attention

    [email protected]

    http://www.math.u-bordeaux1.fr/~cdeledal/


  • [Alter et al., 2006] Alter, F., Matsushita, Y., and Tang, X. (2006). An intensity similarity measure in low-light conditions. Lecture Notes in Computer Science, 3954:267–280.

    [Deledalle et al., 2009] Deledalle, C.-A., Denis, L., and Tupin, F. (2009). Iterative Weighted Maximum Likelihood Denoising with Probabilistic Patch-Based Weights. IEEE Trans. Image Process., 18(12):2661–2672.

    [Matsushita and Lin, 2007] Matsushita, Y. and Lin, S. (2007). A Probabilistic Intensity Similarity Measure based on Noise Distributions. In IEEE Comput. Vis. and Pattern Recognition (CVPR), pages 1–8. IEEE.

    [Minka, 1998] Minka, T. (1998). Bayesian Inference, Entropy, and the Multinomial Distribution. Technical report, Carnegie Mellon University.

    [Minka, 2000] Minka, T. (2000). Distance measures as prior probabilities. Technical report, Carnegie Mellon University.

    [Seeger, 2002] Seeger, M. (2002). Covariance kernels from Bayesian generative models. In Advances in Neural Inf. Process. Syst. (NIPS), volume 2, pages 905–912. MIT Press.

    [Yianilos, 1995] Yianilos, P. (1995). Metric learning via normal mixtures. Technical report, NEC Research Institute, Princeton, NJ.

    C. Deledalle (CNRS/IMB) Patch matching December 05, 2012 26 / 24
