
Embedding Gestalt Laws in Markov Random Fields

by Song-Chun Zhu

Purpose of the Paper

Proposes functions to measure Gestalt features of shapes

Adapts the [Zhu, Wu, Mumford] FRAME method to shapes

Exhibits the effect of the MRF model obtained by putting these together.

Recall Gestalt Features (à la [Lowe] and others)

Collinearity

Cocircularity

Proximity

Parallelism

Symmetry

Continuity

Closure

Familiarity

FRAME [Zhu, Wu, Mumford]

Filters, Random fields, And Maximum Entropy

A general procedure for constructing MRF models

Three Main Parts

Data

Learn MRF models from data

Test generative power of learned model

Elements of Data

A set of images representative of the chosen application domain

An adequate collection of feature measures or filters

The (marginal) statistics of applying the feature measures or filters to the set of images

Data: Images

Zhu considers 22 animal shapes and their horizontal flips

The resulting histograms are symmetric

More data can be obtained

But are there other effects?

Sample Animate Images

Contour-based Feature Measures

Goal is to be generic

But generic shape features are hard to find

φ1 = κ(s), the curvature

κ(s) = 0 implies the linelets on either side of Γ(s) are collinear

φ2 = κ'(s), its derivative

κ'(s) = 0 implies three sequential linelets are cocircular

“Other contour-based shape filters can be defined in the same way”
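As a concrete reading of φ1 and φ2, here is a minimal sketch (not code from the paper): it assumes the contour Γ is a closed polygon whose edges play the role of linelets.

```python
import numpy as np

def contour_features(pts):
    """phi_1 = kappa(s) and phi_2 = kappa'(s) for a closed polygonal
    contour given as an (N, 2) array of vertices.

    Each polygon edge is treated as a linelet: kappa is the turning
    angle between successive linelets divided by arc length, and
    kappa' is its finite difference.
    """
    edges = np.roll(pts, -1, axis=0) - pts            # linelet vectors
    ds = np.hypot(edges[:, 0], edges[:, 1])           # linelet lengths
    theta = np.arctan2(edges[:, 1], edges[:, 0])      # tangent angles
    # Turning angle between consecutive linelets, wrapped to (-pi, pi].
    dtheta = np.angle(np.exp(1j * (np.roll(theta, -1) - theta)))
    kappa = dtheta / ds                               # phi_1: curvature
    dkappa = (np.roll(kappa, -1) - kappa) / ds        # phi_2: derivative
    return kappa, dkappa
```

On a discretized circle, kappa comes out roughly constant and dkappa roughly zero, matching the collinearity and cocircularity readings above.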

Zhu's Symmetry Function

ψ(s) pairs linelets across medial axes

Defined and computed by minimizing an energy functional constructed so that

Paired linelets are as close, parallel and symmetric as possible, and

There are as few discontinuities as possible
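To make the structure of that energy concrete, here is a toy evaluation of such an energy for a candidate pairing. This is my illustration, not Zhu's actual functional: the weights `w` and the discontinuity test are assumptions.

```python
import numpy as np

def pairing_energy(pts, pair, w=(1.0, 1.0, 1.0, 1.0)):
    """Toy energy of a candidate pairing (pair[i] = partner linelet of i).
    Four terms: proximity, non-parallelism, asymmetry, discontinuities."""
    pair = np.asarray(pair)
    edges = np.roll(pts, -1, axis=0) - pts
    theta = np.arctan2(edges[:, 1], edges[:, 0])
    r = np.linalg.norm(pts - pts[pair], axis=1)       # closeness
    # Angle between a linelet and its partner, modulo pi (parallelism).
    dang = np.angle(np.exp(2j * (theta - theta[pair]))) / 2
    dr = np.roll(r, -1) - r                           # symmetry residual
    gaps = np.abs(np.diff(pair, append=pair[0]))      # partner-index jumps
    gaps = np.minimum(gaps, len(pts) - gaps)          # circular index gap
    disc = np.count_nonzero(gaps > 1)                 # discontinuities
    return (w[0] * r.sum() + w[1] * np.abs(dang).sum()
            + w[2] * np.abs(dr).sum() + w[3] * disc)
```

Minimizing this over pairings (which Zhu does with a purpose-built optimization) trades closeness, parallelism, and symmetry against the number of discontinuities.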

Region-based Feature Measures

φ3(s) = dist(s, ψ(s))

Measures proximity of paired linelets across a region

φ4(s) = φ3'(s), the derivative

φ4(s) = 0 implies paired linelets are parallel

φ5(s) = φ4'(s) = φ3''(s)

φ5(s) = 0 implies paired linelets are symmetric
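Given such a pairing, the three region-based measures reduce to finite differences. A sketch under the same polygonal-contour assumptions; `pair` stands in for ψ and is assumed precomputed:

```python
import numpy as np

def region_features(pts, pair, ds):
    """phi_3, phi_4, phi_5 along a closed contour, given a pairing
    pair[i] (the linelet paired with linelet i) and arc steps ds."""
    r = np.linalg.norm(pts - pts[np.asarray(pair)], axis=1)  # phi_3
    dr = (np.roll(r, -1) - r) / ds          # phi_4 = r'; 0 => parallel
    ddr = (np.roll(dr, -1) - dr) / ds       # phi_5 = r''; 0 => symmetric
    return r, dr, ddr
```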

Another Possible Shape Feature

φ6(s) = 1 where ψ(s) is discontinuous, 0 otherwise

Counts the number of “parts” a shape has

Can Gestalt “familiarity” be (statistically?) measured?

The Statistic

The histogram of feature φk over curve Γ is

H(z; φk, Γ) = ∫δ(z-φk(s)) ds

δ is the Dirac delta function: mass 1 at 0, and 0 elsewhere

μ(z; φk) denotes the average over all images

Zhu claims μ is a close estimation of the marginal distribution of the “true distribution” over shape space, assuming the total number of linelets is small.
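In the discrete setting the integral becomes an arc-length-weighted histogram. A minimal sketch; the bin grid and normalization are my choices:

```python
import numpy as np

def histogram_H(values, ds, bins):
    """H(z; phi, Gamma): histogram of feature values along one contour,
    weighted by arc length ds and normalized to total mass 1 (the Dirac
    delta is binned onto a finite grid of z values)."""
    h, _ = np.histogram(values, bins=bins, weights=ds)
    return h / h.sum()

def mu_estimate(per_image_values, per_image_ds, bins):
    """mu(z; phi): the average of H over all training images."""
    return np.mean([histogram_H(v, d, bins)
                    for v, d in zip(per_image_values, per_image_ds)], axis=0)
```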

Statistical Observations

Histograms of φ1 at scales 0, 1, 2

Histograms of φ3, φ4, φ5

On the 22 images and their flips

Construct a Model

Ω is the space of shapes

Φ is a finite subset of feature filters

We seek a probability distribution p on Ω

∫Ω p(Γ) dΓ = 1 (1)

That reproduces the statistics for all φ in Φ

∫Ω p(Γ) H(z; φ, Γ) dΓ = μ(z; φ) (2)

Construct a Model, 2

Idea: Choose the p with maximal entropy

Seems reasonable and fair, but is it really the best target/energy function?

Lagrange multipliers and calculus of variations lead to

p(Γ; Φ, Λ) = exp(−∑φ∈Φ ∫ λφ(z) H(z; φ, Γ) dz) / Z

where Z is the usual normalizing factor

Λ = { λφ | φ ∈ Φ }
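For the record, the omitted step is the standard maximum-entropy calculation; this is my reconstruction from constraints (1) and (2), with the final sign convention absorbing a minus into λφ:

```latex
% Lagrangian: entropy plus multipliers for constraints (1) and (2)
\mathcal{L}[p] = -\int_\Omega p(\Gamma)\log p(\Gamma)\,d\Gamma
  + \lambda_0\Big(\int_\Omega p(\Gamma)\,d\Gamma - 1\Big)
  + \sum_{\varphi\in\Phi}\int \lambda_\varphi(z)
      \Big(\int_\Omega p(\Gamma)\,H(z;\varphi,\Gamma)\,d\Gamma
           - \mu(z;\varphi)\Big)\,dz

% Setting the variational derivative to zero:
\frac{\delta\mathcal{L}}{\delta p(\Gamma)}
  = -\log p(\Gamma) - 1 + \lambda_0
    + \sum_{\varphi\in\Phi}\int \lambda_\varphi(z)\,H(z;\varphi,\Gamma)\,dz
  = 0

% Solving for p and writing -lambda for lambda gives the slide's form:
p(\Gamma;\Phi,\Lambda)
  = \frac{1}{Z}\exp\Big(-\sum_{\varphi\in\Phi}
      \int \lambda_\varphi(z)\,H(z;\varphi,\Gamma)\,dz\Big)
```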

It's a Gibbs Distribution

In other words, it has the form of a Gibbs distribution, and therefore determines a Markov Random Field (MRF) model.

Markov Chain Monte Carlo

Too hard to compute λ's and p analytically

Idea: Sample Ω according to the distribution p, stochastically update Λ to update p, and repeat until p reproduces all μ(z; φ) for φ ∈ Φ

“Monte Carlo” because of the random walk

“Markov Chain” because of the nature of the loop

Markov Chain Monte Carlo, 2

From the sampling produce μ'(z; φ)

Same as μ(z; φ) except based on a random sample of shape space

For the purposes of today's discussion, the details are not important

At convergence, for φ ∈ Φ,

μ'(z; φ) = μ(z; φ)

Zhu et al. assume there exists a “true underlying distribution”
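A minimal sketch of the multiplier update inside that loop (the dict-of-arrays representation and step size are my assumptions; the MCMC random walk that produces μ' is elided). With p ∝ exp(−∑ λ·H), gradient ascent on the log-likelihood moves each λφ by μ'(z; φ) − μ(z; φ):

```python
def update_lambdas(lam, mu_sample, mu_obs, step=0.05):
    """One stochastic update of the Lagrange multipliers.

    lam, mu_sample, mu_obs: dicts mapping each feature phi to a binned
    array over z.  Bins where the model puts too much mass (mu' > mu)
    get a larger penalty lambda, and vice versa; at the fixed point
    mu'(z; phi) = mu(z; phi) for every phi in Phi.
    """
    for phi in lam:
        lam[phi] = lam[phi] + step * (mu_sample[phi] - mu_obs[phi])
    return lam
```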

The Nonaccidental Statistic

For φ' not in the set Φ we expect

μ'(z; φ') ≠ μ(z; φ')

μ'(z; φ') is the accidental statistic for φ'

It is a measure of correlation between φ' and Φ

The “distance” (L1, L2, or other) between μ'(z; φ') and μ(z; φ') is the nonaccidental statistic for φ'

It is a measure of how much “additional information” φ' carries above what is already in Φ
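As a sketch, with histograms binned as above (the choice of norm is left open in the slide):

```python
import numpy as np

def nonaccidental(mu_sample, mu_obs, order=1):
    """Nonaccidental statistic of a feature phi' not in Phi: the
    distance between its accidental statistic mu'(z; phi') and the
    observed mu(z; phi').  order=1 gives L1, order=2 gives L2."""
    return np.linalg.norm(np.asarray(mu_sample) - np.asarray(mu_obs),
                          ord=order)
```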

The Algorithm (simplified)

Enter your set Γ = { γ } of shapes

Enter a (large) set { φ } of candidate feature measures

Compute μ(z; φ) over the shapes for every candidate φ

Compute μ'(z; φ) relative to a uniform distribution on Ω

Until the nonaccidental statistic of all unused features is small enough, repeat:

Algorithm, 2

Of the remaining φ, add to Φ one with maximal nonaccidental statistic

Update:

Set of Lagrange multipliers Λ = { λ }

Probability distribution model p(Φ, Λ)

The μ'(φ) for remaining candidate features φ
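Putting the last two slides together, a sketch of the pursuit loop. It reuses `nonaccidental` (and, implicitly, `update_lambdas`) from above; `fit_and_sample` is a hypothetical callback that re-fits Λ for the current Φ, samples the model, and returns μ'(z; φ) for every candidate:

```python
def pursue_features(candidates, mu_obs, fit_and_sample, threshold):
    """Simplified greedy feature pursuit.

    candidates : list of candidate feature names
    mu_obs     : dict feature -> observed histogram mu(z; phi)
    """
    Phi = []
    mu_sample = fit_and_sample(Phi)          # start from the uniform model
    while True:
        gaps = {phi: nonaccidental(mu_sample[phi], mu_obs[phi])
                for phi in candidates if phi not in Phi}
        if not gaps or max(gaps.values()) < threshold:
            return Phi                       # all remaining gaps are small
        Phi.append(max(gaps, key=gaps.get))  # maximal nonaccidental statistic
        mu_sample = fit_and_sample(Phi)      # update Lambda, p, and the mu'
```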

Experiments and Discussion

Let my description of these experiments stimulate your thoughts on such issues as

Are there better Gestalt feature measures?

What is the best possible outcome of a generative model of shape?

What feature measures should be added to the Gestalt ones?

How useful were these experiments and what others might be worth doing?

Experiment 1

When the only feature used is the curvature, the model generated the shapes shown.

Experiment 1, continued

A Gaussian model (with the same κ-variance) produced the shapes shown.

Experiment 2

Experiment 2 uses both κ and κ'

The nonaccidental statistic of κ' with respect to the model based on κ can be seen here

Experiment 2, continued

This time the model generated shapes purported to be smoother and more scale-invariant.

Experiment 3

The nonaccidental statistics of the three region-based shape features relative to the model produced in Experiment 2

Experiment 3, continued

So r'' was omitted; this model has

Φ = { κ, κ', r, r' }

Experiment 3, continued

This model produced such shapes as those shown.

Concluding Discussion

Zhu acknowledges that the selection of training shapes might introduce a bias; but

Discussion, continued

Zhu acknowledges that the paucity of Gestalt features limits the possible neighborhood structures used to define an MRF.

Zhu acknowledges that these models do not account for high-level shape properties, and suggests that a composition system might address this problem.

Questions and Comments

Although it is in the nature of an MRF-model to propagate local properties, I think there needs to be a higher-level basis (than linelets) for measuring the Gestalt features of a shape!

Are there better Gestalt feature measures?

What feature measures should be added to the Gestalt ones?

More Questions for Discussion

What is the best possible outcome of a generative model of shape? Is such a thing worth pursuing?

How useful were Zhu's experiments and what others might be worth doing?