Gaussian Mixture Model (GMM) using Expectation Maximization
(EM) Technique
Book : C.M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006
The Gaussian Distribution

Univariate Gaussian distribution:

$$\mathcal{N}(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

where $\mu$ is the mean and $\sigma^2$ is the variance.

Multivariate Gaussian distribution:

$$\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \Sigma) = \frac{1}{(2\pi)^{D/2}\, |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})\right)$$

where $\boldsymbol{\mu}$ is the mean and $\Sigma$ is the covariance matrix.

We need to estimate these parameters of a distribution. One method: Maximum Likelihood (ML) estimation.
ML Method for Estimating Parameters

Consider the log of the Gaussian distribution:

$$\ln p(\mathbf{x} \mid \boldsymbol{\mu}, \Sigma) = -\frac{D}{2}\ln(2\pi) - \frac{1}{2}\ln|\Sigma| - \frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})$$

Take the derivatives and equate them to zero:

$$\frac{\partial \ln p(\mathbf{x} \mid \boldsymbol{\mu}, \Sigma)}{\partial \boldsymbol{\mu}} = 0, \qquad \frac{\partial \ln p(\mathbf{x} \mid \boldsymbol{\mu}, \Sigma)}{\partial \Sigma} = 0$$

Solving gives

$$\boldsymbol{\mu}_{ML} = \frac{1}{N}\sum_{n=1}^{N} \mathbf{x}_n, \qquad \Sigma_{ML} = \frac{1}{N}\sum_{n=1}^{N} (\mathbf{x}_n - \boldsymbol{\mu}_{ML})(\mathbf{x}_n - \boldsymbol{\mu}_{ML})^T$$

where N is the number of samples or data points.
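The two ML estimates above are just a sample mean and a sample covariance. A short sketch on synthetic data (the data and the variable names are my own, for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2-D data drawn from a known Gaussian
X = rng.multivariate_normal(mean=[1.0, -2.0],
                            cov=[[1.0, 0.3], [0.3, 0.5]], size=5000)

N = X.shape[0]
mu_ml = X.mean(axis=0)              # (1/N) sum_n x_n
diff = X - mu_ml
sigma_ml = (diff.T @ diff) / N      # (1/N) sum_n (x_n - mu)(x_n - mu)^T
print(mu_ml)
print(sigma_ml)
```

With 5000 samples the estimates land close to the generating mean and covariance; note the ML covariance divides by N, not N−1, so it is slightly biased.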
Gaussian Mixtures

Linear superposition of Gaussians:

$$p(\mathbf{x}) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \Sigma_k)$$

K is the number of Gaussians; the mixing coefficient $\pi_k$ is the weight of each Gaussian component. Normalization and positivity require

$$0 \le \pi_k \le 1, \qquad \sum_{k=1}^{K} \pi_k = 1$$

ML does not work here, as there is no closed-form solution. The parameters can instead be calculated using the Expectation Maximization (EM) technique.

Consider the log likelihood:

$$\ln p(X \mid \boldsymbol{\pi}, \boldsymbol{\mu}, \Sigma) = \sum_{n=1}^{N} \ln p(\mathbf{x}_n) = \sum_{n=1}^{N} \ln \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \Sigma_k)$$
Example: Mixture of 3 Gaussians

(Figure: two panels, (a) and (b), plotting sample data on the unit square $[0,1] \times [0,1]$.)
Latent Variable: Posterior Probability

We can think of the mixing coefficients as prior probabilities for the components. For a given value of $\mathbf{x}$, we can evaluate the corresponding posterior probabilities, called responsibilities. From Bayes' rule,

$$\gamma_k(\mathbf{x}) \equiv p(k \mid \mathbf{x}) = \frac{p(k)\, p(\mathbf{x} \mid k)}{p(\mathbf{x})} = \frac{\pi_k \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_j, \Sigma_j)}$$

where $\pi_k = N_k / N$. Interpret $N_k$ as the effective number of points assigned to cluster k.
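The responsibility formula above is a per-point normalization of the weighted component densities. A small sketch (the function name `responsibilities` is my own):

```python
import numpy as np
from scipy.stats import multivariate_normal

def responsibilities(X, pis, mus, sigmas):
    """gamma[n, k] = pi_k N(x_n|mu_k,Sigma_k) / sum_j pi_j N(x_n|mu_j,Sigma_j)."""
    # weighted[n, k] = pi_k * N(x_n | mu_k, Sigma_k)
    weighted = np.column_stack(
        [pi * multivariate_normal(mu, s).pdf(X)
         for pi, mu, s in zip(pis, mus, sigmas)]
    )
    # Normalize each row so the responsibilities sum to 1 over components
    return weighted / weighted.sum(axis=1, keepdims=True)
```

Each row of the result is a discrete posterior over the K components for one data point, so each row sums to one; points sitting at a component mean take responsibility close to one for that component.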
Expectation Maximization

The EM algorithm is an iterative optimization technique which operates locally.

(Figure: iterative optimization path from an initial point to an optimal point.)

Expectation step: for given parameter values, we compute the expected values of the latent variables.

Maximization step: updates the parameters of our model, based on the latent variables, using the ML method.
EM Algorithm for GMM

Given a Gaussian mixture model, the goal is to maximize the likelihood function with respect to the parameters (comprising the means and covariances of the components and the mixing coefficients).

1. Initialize the means $\boldsymbol{\mu}_k$, covariances $\Sigma_k$ and mixing coefficients $\pi_k$, and evaluate the initial value of the log likelihood.

2. E step. Evaluate the responsibilities using the current parameter values:

$$\gamma_k(\mathbf{x}_n) = \frac{\pi_k \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_j, \Sigma_j)}$$
EM Algorithm for GMM

3. M step. Re-estimate the parameters using the current responsibilities:

$$\boldsymbol{\mu}_k^{new} = \frac{\sum_{n=1}^{N} \gamma_k(\mathbf{x}_n)\, \mathbf{x}_n}{\sum_{n=1}^{N} \gamma_k(\mathbf{x}_n)}, \qquad \Sigma_k^{new} = \frac{\sum_{n=1}^{N} \gamma_k(\mathbf{x}_n)\, (\mathbf{x}_n - \boldsymbol{\mu}_k^{new})(\mathbf{x}_n - \boldsymbol{\mu}_k^{new})^T}{\sum_{n=1}^{N} \gamma_k(\mathbf{x}_n)}, \qquad \pi_k^{new} = \frac{1}{N}\sum_{n=1}^{N} \gamma_k(\mathbf{x}_n)$$

4. Evaluate the log likelihood

$$\ln p(X \mid \boldsymbol{\pi}, \boldsymbol{\mu}, \Sigma) = \sum_{n=1}^{N} \ln \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \Sigma_k)$$

If there is no convergence, return to step 2.
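Steps 1–4 can be assembled into a compact EM loop. The sketch below is a minimal illustration of the updates, not the slides' own code: it runs a fixed number of iterations instead of a convergence test, and the function name `em_gmm`, the optional `init_means` argument, and the demo data are all assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, init_means=None, seed=0):
    """A minimal sketch of EM for a GMM, following steps 1-4 above."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # 1. Initialize: means from random data points (or given), identity
    #    covariances, uniform mixing coefficients
    if init_means is not None:
        mus = np.asarray(init_means, dtype=float)
    else:
        mus = X[rng.choice(N, K, replace=False)].astype(float)
    sigmas = np.stack([np.eye(D) for _ in range(K)])
    pis = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # 2. E step: responsibilities gamma[n, k]
        dens = np.column_stack(
            [pi * multivariate_normal(mu, s).pdf(X)
             for pi, mu, s in zip(pis, mus, sigmas)]
        )
        gamma = dens / dens.sum(axis=1, keepdims=True)
        # 3. M step: re-estimate parameters from the responsibilities
        Nk = gamma.sum(axis=0)                  # effective points per cluster
        mus = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mus[k]
            sigmas[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k]
        pis = Nk / N
    # 4. Final log likelihood under the last parameter estimates
    dens = np.column_stack(
        [pi * multivariate_normal(mu, s).pdf(X)
         for pi, mu, s in zip(pis, mus, sigmas)]
    )
    ll = np.sum(np.log(dens.sum(axis=1)))
    return pis, mus, sigmas, ll

# Demo on well-separated synthetic clusters (assumed data, for illustration)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.0, 0.0], 0.3, (200, 2)),
               rng.normal([5.0, 5.0], 0.3, (200, 2))])
pis, mus, sigmas, ll = em_gmm(X, K=2, n_iter=50,
                              init_means=[[1.0, 1.0], [4.0, 4.0]])
print(pis)
print(mus)
```

A production implementation would additionally work in log space for numerical stability, regularize the covariances to avoid singular components, and stop when the log likelihood change falls below a threshold.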
EM Algorithm : Example

(Figure sequence: successive iterations of the EM algorithm on the example data set, one slide per stage.)