Graph Structure, Complexity and Learning
Edwin Hancock, Department of Computer Science, University of York
Supported by a Royal Society Wolfson Research Merit Award


Page 1:

Graph Structure, Complexity and Learning

Edwin Hancock
Department of Computer Science
University of York

Supported by a Royal Society Wolfson Research Merit Award

Page 2:

Royal Society

• Chicheley Hall meeting May 2012

• Video proceedings: http://videolectures.net/complexnetworks2012_london/

Page 3:

Structural Variations

Page 4:

Protein-Protein Interaction Networks

Page 5:

Graph data

• Problems based on graphs arise in areas such as language processing, proteomics/chemoinformatics, data mining, computer vision and complex systems.

• Relatively little methodology available, and vectorial methods from statistical machine learning not easily applied since there is no canonical ordering of the nodes in a graph.

• Can make considerable progress if we develop permutation-invariant characterisations of variations in graph structure.


Page 8:

Characterising graphs

• Topological: e.g. average degree, degree distribution, edge-density, diameter, cycle frequencies etc.

• Spectral or algebraic: use eigenvalues of the adjacency matrix or Laplacian, or equivalently the coefficients of the characteristic polynomial.

• Complexity: use information theoretic measures of structure (e.g. Shannon entropy).

Page 9:

Complex systems

• Spatial and topological indices: node degree statistics; edge density.

• Communicability: communities, measures of centrality, separation, etc. (Barabási, Watts and Strogatz, Estrada).

• Processes on graphs: Markov processes, Ising models, random walks, searchability (Kleinberg).

Page 10:

Links explored in this talk

• Structure: discriminate between graphs on the basis of their detailed structure.

• Complexity: determine whether different non-isomorphic structures are of similar or different intrinsic complexity.

• Learning: learn a generative model of structure that gives a minimum-complexity description of the training data (MDL).

Page 11:

Structure

Page 12:

Graph spectra and random walks

Use the spectrum of the Laplacian matrix to compute hitting and commute times for a random walk on a graph.
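For concreteness, the standard spectral expression for the commute time, written in terms of the Laplacian eigensystem defined on the next slides (vol(G) denotes the sum of degrees; this is the usual formula, not taken verbatim from the slide):

```latex
CT(u,v) \;=\; \mathrm{vol}(G)\sum_{\lambda_k \neq 0}
\frac{\bigl(\phi_k(u)-\phi_k(v)\bigr)^2}{\lambda_k},
\qquad \mathrm{vol}(G)=\sum_{u\in V} d_u
```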

Page 13:

Laplacian Matrix

• Weighted adjacency matrix: $W(u,v) = w(u,v)$ if $(u,v) \in E$, and $W(u,v) = 0$ otherwise.

• Degree matrix: $D(u,u) = \sum_{v \in V} W(u,v)$

• Laplacian matrix: $L = D - W$
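A minimal NumPy sketch of these three matrices, on a hypothetical 4-node weighted graph (the edge list is purely illustrative):

```python
import numpy as np

# Hypothetical 4-node weighted graph, given as (u, v, weight) tuples.
edges = [(0, 1, 1.0), (0, 2, 1.0), (1, 2, 0.5), (2, 3, 2.0)]
n = 4

W = np.zeros((n, n))
for u, v, w in edges:
    W[u, v] = W[v, u] = w      # W(u,v) = w(u,v) if (u,v) in E, 0 otherwise

D = np.diag(W.sum(axis=1))     # D(u,u) = sum over v of W(u,v)
L = D - W                      # Laplacian matrix L = D - W
```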

Page 14:

Laplacian spectrum

• Spectral decomposition of the Laplacian: $L = \Phi \Lambda \Phi^T$, where $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_{|V|})$, $\Phi = (\phi_1 | \cdots | \phi_{|V|})$ and $0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_{|V|}$.

• Element-wise: $L(u,v) = \sum_k \lambda_k \phi_k(u) \phi_k(v)$

Page 15:

Properties of the Laplacian

• Eigenvalues are non-negative and the smallest eigenvalue is zero: $0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_{|V|}$.

• The multiplicity of the zero eigenvalue is the number of connected components of the graph.

• The zero eigenvalue is associated with the all-ones vector.

• The eigenvector associated with the second smallest eigenvalue is the Fiedler vector.
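These properties are easy to check numerically; a small sketch on a deliberately disconnected graph (two disjoint triangles), where the zero eigenvalue should appear twice:

```python
import numpy as np

# Two disjoint triangles: 2 connected components.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1.0
L = np.diag(A.sum(axis=1)) - A

lam, phi = np.linalg.eigh(L)            # eigenvalues sorted ascending
assert lam.min() > -1e-10               # eigenvalues are non-negative
print(np.sum(np.abs(lam) < 1e-10))      # multiplicity of 0 -> 2 components
# For a connected graph, phi[:, 1] would be the Fiedler vector.
```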

Page 16:

Continuous time random walk

Page 17:

Heat Kernels

• The heat kernel is the solution of the heat equation, and measures information flow across the edges of the graph with time:

  $\frac{\partial h_t}{\partial t} = -L h_t$

• The solution is found by exponentiating the Laplacian eigensystem ($L = D - W = \Phi \Lambda \Phi^T$):

  $h_t = \Phi \exp[-\Lambda t] \Phi^T = \sum_k \exp[-\lambda_k t]\, \phi_k \phi_k^T$

Page 18:

Heat kernel and random walk

• The state vector of the continuous-time random walk satisfies the differential equation

  $\frac{\partial p_t}{\partial t} = -L p_t$

• Solution: $p_t = \exp[-Lt]\, p_0 = h_t\, p_0$
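A short NumPy sketch tying the two slides together: the heat kernel built from the Laplacian eigensystem, applied to an initial state vector (the graph and start node are illustrative):

```python
import numpy as np

def heat_kernel(L, t):
    """h_t = Phi exp(-Lambda t) Phi^T from the Laplacian eigensystem."""
    lam, phi = np.linalg.eigh(L)
    return phi @ np.diag(np.exp(-lam * t)) @ phi.T

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

p0 = np.array([1.0, 0.0, 0.0, 0.0])     # walk starts at node 0
pt = heat_kernel(L, t=0.5) @ p0         # p_t = exp(-Lt) p0 = h_t p0
print(pt, pt.sum())                     # probability mass stays at 1
```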

Page 19:

Example

The figure shows the spanning tree of the heat kernel; here the edge weights of the graph are the elements of the heat kernel. As t increases, the spanning tree evolves from a tree rooted near the centre of the graph to a string (with ligatures).

Low-t behaviour is dominated by the Laplacian, high-t behaviour by the Fiedler vector.

Page 20:

Moments of the heat-kernel trace

Can we characterise a graph by the shape of its heat-kernel trace function?


Page 22:

Heat Kernel Trace

$Tr[h_t] = \sum_i \exp[-\lambda_i t]$

The shape of the heat-kernel trace distinguishes graphs. Can we characterise its shape using moments? Use the moments of the heat-kernel trace:

$\mu(s) = \int_0^\infty t^{s-1}\, Tr[h_t]\, dt$

[Figure: heat-kernel trace plotted against time t.]

Page 23:

Rosenberg Zeta function

• Definition of the zeta function (a sum over the non-zero Laplacian eigenvalues):

$\zeta(s) = \sum_{\lambda_k \neq 0} \lambda_k^{-s}$

Page 24:

Heat-kernel moments

• Mellin transform:

  $\lambda_i^{-s} = \frac{1}{\Gamma(s)} \int_0^\infty t^{s-1} \exp[-\lambda_i t]\, dt, \qquad \Gamma(s) = \int_0^\infty t^{s-1} \exp[-t]\, dt$

• Trace and number of connected components:

  $Tr[h_t] = C + \sum_{\lambda_i \neq 0} \exp[-\lambda_i t]$

  where C is the multiplicity of the zero eigenvalue, i.e. the number of connected components of the graph.

• Zeta function:

  $\zeta(s) = \sum_{\lambda_i \neq 0} \lambda_i^{-s} = \frac{1}{\Gamma(s)} \int_0^\infty t^{s-1} \left( Tr[h_t] - C \right) dt$

The zeta function is therefore related to the moments of the heat-kernel trace.
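The relation is straightforward to verify numerically; a sketch computing ζ(s) both directly from the spectrum and via the Mellin integral (the graph and the value of s are illustrative):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
lam = np.linalg.eigvalsh(L)
C = int(np.sum(lam < 1e-10))            # number of connected components
nonzero = lam[lam > 1e-10]

s = 2.0
zeta_direct = np.sum(nonzero ** (-s))   # sum over non-zero eigenvalues

# Mellin route: (1/Gamma(s)) * int_0^inf t^{s-1} (Tr[h_t] - C) dt
f = lambda t: t ** (s - 1) * (np.sum(np.exp(-lam * t)) - C)
zeta_mellin = quad(f, 0, np.inf)[0] / gamma(s)
print(zeta_direct, zeta_mellin)         # the two routes agree
```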

Page 25:

Zeta-function behavior

Page 26:

Objects

72 views of each object, taken at 5-degree intervals as the camera moves in a circle around the object.

Feature points are extracted using a corner detector.

A Voronoi tessellation of the image plane is constructed using the corner points as seeds.

The Delaunay graph is the region adjacency graph of the Voronoi regions.

Page 27:

Heat kernel moments

(zeta(s), s=1,2,3,4)

Page 28:

PCA using zeta(s), s=1,2,3,4

Page 29:

Zeta function derivative

• Zeta function in terms of the natural exponential:

  $\zeta(s) = \sum_{\lambda_k \neq 0} \lambda_k^{-s} = \sum_{\lambda_k \neq 0} \exp[-s \ln \lambda_k]$

• Derivative:

  $\zeta'(s) = -\sum_{\lambda_k \neq 0} \ln \lambda_k \exp[-s \ln \lambda_k]$

• Derivative at the origin:

  $\zeta'(0) = -\sum_{\lambda_k \neq 0} \ln \lambda_k = -\ln \prod_{\lambda_k \neq 0} \lambda_k$

Page 30:

Meaning

• Number of spanning trees in the graph (from the normalised Laplacian spectrum):

  $\tau(G) = \frac{\prod_{u \in V} d_u}{\sum_{u \in V} d_u} \exp[-\zeta'(0)]$
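Assuming the zeta function here is that of the normalised Laplacian (as the degree terms suggest), the formula is easy to sanity-check on a triangle, which has exactly 3 spanning trees:

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # triangle graph
d = A.sum(axis=1)
Lhat = np.eye(3) - A / np.sqrt(np.outer(d, d))   # normalised Laplacian

lam = np.linalg.eigvalsh(Lhat)
nonzero = lam[lam > 1e-10]
zeta_prime_0 = -np.sum(np.log(nonzero))          # zeta'(0) = -sum ln(lambda_k)

tau = (np.prod(d) / np.sum(d)) * np.exp(-zeta_prime_0)
print(tau)                                        # -> 3.0 spanning trees
```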

Page 31:

COIL

Page 32:

Deeper probes of structure

Ihara zeta function

Page 33:

Zeta functions

• Used in number theory to characterise the distribution of prime numbers.

• Can be extended to graphs by replacing the notion of a prime number with that of a prime cycle.

Page 34:

Ihara Zeta function

• Determined by distribution of prime cycles.

• Transform graph to oriented line graph (OLG) with edges as nodes and edges indicating incidence at a common vertex.

• Zeta function is reciprocal of characteristic polynomial for OLG adjacency matrix.

• Coefficients of polynomial determined by eigenvalues of OLG adjacency matrix.

• Powers of OLG adjacency matrix give prime cycle frequencies.
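A sketch of the construction just described (the test graph is illustrative): the OLG adjacency matrix T connects directed edge (u,v) to (v,w) whenever w ≠ u, the characteristic polynomial of T carries the Ihara coefficients, and Tr[T^l] counts closed non-backtracking walks of length l:

```python
import numpy as np

def olg_adjacency(edges):
    """Adjacency matrix T of the oriented line graph: nodes are directed
    edges; arc (u,v) -> (v,w) whenever w != u (no backtracking)."""
    arcs = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    T = np.zeros((len(arcs), len(arcs)))
    for i, (u, v) in enumerate(arcs):
        for j, (x, w) in enumerate(arcs):
            if x == v and w != u:
                T[i, j] = 1.0
    return T

T = olg_adjacency([(0, 1), (1, 2), (0, 2), (2, 3)])

# zeta(s)^{-1} = det(I - sT): the coefficients of the characteristic
# polynomial of T are the Ihara coefficients.
ihara_coeffs = np.poly(np.linalg.eigvals(T)).real

# Prime-cycle frequencies N_l = Tr[T^l]
for l in (3, 4, 5):
    print(l, round(np.trace(np.linalg.matrix_power(T, l))))
```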

Page 35:

Oriented Line Graph

Page 36:

Ihara Zeta Function

• Ihara zeta function for a graph G(V,E):

  – Defined over the prime cycles of the graph.

  – A rational expression in terms of the characteristic polynomial of the oriented line graph: $\zeta(s)^{-1} = \det(I - sA)$, where A is the adjacency matrix of the line digraph.

  – In the equivalent Bass determinant form, $\zeta(s)^{-1} = (1 - s^2)^{|E| - |V|} \det(I - sA_G + s^2 Q)$, where $A_G$ is the adjacency matrix of the graph and Q = D − I (degree matrix minus identity).

Page 37:

Characteristic Polynomials from the IZF

• The Perron-Frobenius operator is the adjacency matrix $T_H$ of the oriented line graph.

• Determinant expression of the IZF: $\zeta(s)^{-1} = \det(I - sT_H)$.

  – Each coefficient, i.e. Ihara coefficient, can be derived from the elementary symmetric polynomials of the eigenvalue set of $T_H$.

• Pattern vector formed from the Ihara coefficients.

Page 38:

Analysis of determinant

• From matrix logarithms:

  $\zeta(s) = \det[I - sT]^{-1} = \exp\left[ \sum_{k=1}^{\infty} \frac{s^k}{k} Tr[T^k] \right]$

• $Tr[T^k]$ is a symmetric polynomial in the eigenvalues of T:

  $Tr[T^1] = \mu_1 + \mu_2 + \cdots + \mu_N$

  $Tr[T^2] = \mu_1^2 + \mu_2^2 + \cdots + \mu_N^2$

  $\vdots$

  $Tr[T^k] = \mu_1^k + \mu_2^k + \cdots + \mu_N^k$

Page 39:

Distribution of prime cycles

• Frequency distribution for cycles of length l:

  $\frac{d}{ds} \ln \zeta(s) = \sum_l N_l\, s^{l-1}$

• Cycle frequencies:

  $N_l = \frac{1}{(l-1)!} \left. \frac{d^l}{ds^l} \ln \zeta(s) \right|_{s=0} = Tr[T^l]$

Page 40:

Experiments: Edge-weighted Graphs

Feature distance and edit distance for three classes of randomly generated graphs.

Page 41:

Experiments: Hypergraphs

Page 42:

Complexity

Information theory, graphs and kernels.

Page 43:

Protein-Protein Interaction Networks

Page 44:

Characterising graphs

• Topological: e.g. average degree, degree distribution, edge-density, diameter, cycle frequencies etc.

• Spectral or algebraic: use eigenvalues of the adjacency matrix or Laplacian, or equivalently the coefficients of the characteristic polynomial.

• Complexity: use information theoretic measures of structure (e.g. Shannon entropy).

Page 45:

Complexity characterisation

• Information theory: entropy measures.

• Structural pattern recognition: graph spectral indices of structure and topology.

• Complex systems: measures of centrality, separation, searchability.

Page 46:

Information theory

• Entropic measures of complexity: Shannon, Erdős-Rényi, von Neumann.

• Description length: fitting of models to data; entropy (model cost) tensioned against log-likelihood (goodness of fit).

• Kernels: use entropy to compute the Jensen-Shannon divergence.

Page 47:

Recent work

• Escolano, Hancock, Lozano: Phys. Rev. E (thermodynamic depth complexity, using Bregman balls to compute entropy).

• Han, Wilson and Hancock: Patt. Rec. Lett. (fast entropy computation using an approximation to the von Neumann entropy).

Page 48:

Thermodynamic Depth Complexity

• Simulate heat flow on the graph using a continuous-time random walk.

• Characterise nodes by their thermodynamic depth (the time the walk takes to reach the node).

• Measure the heat-flow dependence at each node with time; record the maximum.

• Compute homogeneity statistics over thermodynamic depth.

Page 49:

Phase transition

• As time evolves, the complexity undergoes a phase transition.

• This corresponds to maximum flow at a node.

• Maximum of entropy.

Page 50:

Von Neumann Entropy

• Derived from the normalised Laplacian spectrum:

  $H_{VN} = -\sum_{i=1}^{|V|} \frac{\hat\lambda_i}{2} \ln \frac{\hat\lambda_i}{2}, \qquad \hat{L} = D^{-1/2} (D - A) D^{-1/2} = \hat\Phi \hat\Lambda \hat\Phi^T$

• Comes from quantum mechanics, and is the entropy associated with a density matrix.

Page 51:

Approximation

• Quadratic entropy:

  $H_{VN} \approx \sum_{i=1}^{|V|} \frac{\hat\lambda_i}{2}\left(1 - \frac{\hat\lambda_i}{2}\right) = \frac{1}{2} \sum_{i=1}^{|V|} \hat\lambda_i - \frac{1}{4} \sum_{i=1}^{|V|} \hat\lambda_i^2$

• In terms of matrix traces:

  $H_{VN} = \frac{1}{2} Tr[\hat{L}] - \frac{1}{4} Tr[\hat{L}^2]$

Page 52:

Computing Traces

• Normalised Laplacian: $Tr[\hat{L}] = |V|$

• Normalised Laplacian squared: $Tr[\hat{L}^2] = |V| + \sum_{(u,v) \in E} \frac{1}{d_u d_v}$

Page 53:

Simplified entropy

Collecting terms together, the von Neumann entropy reduces to

$H_{VN} = \frac{|V|}{4} - \frac{1}{4} \sum_{(u,v) \in E} \frac{1}{d_u d_v}$
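A sketch comparing the exact spectral entropy with this degree-based shortcut (the graph is illustrative; the edge sum counts each edge in both orientations, matching the trace formula above):

```python
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
n = len(d)
Lhat = np.eye(n) - A / np.sqrt(np.outer(d, d))

# Exact entropy from the normalised Laplacian spectrum (Page 50)
lam = np.linalg.eigvalsh(Lhat)
x = lam[lam > 1e-12] / 2
H_exact = -np.sum(x * np.log(x))

# Degree-based approximation (this slide): O(|E|), no eigendecomposition
pair_sum = sum(1.0 / (d[u] * d[v])
               for u in range(n) for v in range(n) if A[u, v])
H_approx = n / 4 - pair_sum / 4
print(H_exact, H_approx)
```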

Page 54:

Homogeneity index

Based on degree statistics:

$\rho(G) = \sum_{(u,v) \in E} \left( d_u^{-1/2} - d_v^{-1/2} \right)^2, \qquad \tilde\rho(G) = \frac{\rho(G)}{|V| - 2\sqrt{|V| - 1}}$

Page 55:

Homogeneity meaning

In the limit of large degree,

$\rho(G) \sim \sum_{(u,v) \in E} \left( CT(u,v) - 2 \right) A(u,v)$

The index is largest when the commute time CT(u,v) differs from 2, due to a large number of alternative connecting paths.

Page 56:

Directed Graphs

The von Neumann entropy comes from the in-degree and out-degree of vertices connected by edges:

$H_{VN} = 1 - \frac{1}{|V|} - \frac{1}{2|V|^2} \sum_{(i,j) \in E} \left[ \frac{d_i^{in}}{d_j^{in} d_i^{out}} + \frac{1}{d_i^{out} d_j^{out}} \right]$

The development comes from the Laplacian of a directed graph (Chung).
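A small sketch of the degree-based computation, under the reconstruction of the formula given above (the arc list is illustrative, and the exact form of the sum is an assumption recovered from the garbled slide):

```python
import numpy as np

arcs = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 0)]   # directed edges i -> j
n = 4
d_in, d_out = np.zeros(n), np.zeros(n)
for i, j in arcs:
    d_out[i] += 1
    d_in[j] += 1

# Approximate von Neumann entropy from in/out degrees of connected vertices
# (assumed form; see the reconstructed equation above).
H = 1 - 1 / n - (1 / (2 * n ** 2)) * sum(
    d_in[i] / (d_in[j] * d_out[i]) + 1 / (d_out[i] * d_out[j])
    for i, j in arcs)
print(H)
```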

Page 57:

Uses

• Complexity-based clustering (especially protein-protein interaction networks).

• Defining information-theoretic (Jensen-Shannon) kernels.

• Controlling the complexity of generative models of graphs.

Page 58:

Protein-Protein Interaction Networks

Page 59:

Experiment

Page 60:

Page 61:

Page 62:

Jensen-Shannon Kernel

• Defined in terms of the J-S divergence:

  $JS(G_i, G_j) = H(G_i \oplus G_j) - \frac{1}{2}\left( H(G_i) + H(G_j) \right)$

  $K_{JS}(G_i, G_j) = \ln 2 - JS(G_i, G_j)$

• Properties: extensive, positive.

Page 63:

Computation

• Construct the direct product graph for each graph pair.

• Compute the von Neumann entropy difference between the product graph and the two graphs individually.

• Construct the kernel matrix over all pairs.
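A sketch of this pipeline, assuming the composite graph $G_i \oplus G_j$ is the direct (tensor) product and reusing the degree-based entropy of Page 53 (both choices are one plausible reading of the slides; normalisation details are in the cited papers):

```python
import numpy as np

def vn_entropy(A):
    """Degree-based von Neumann entropy approximation (Page 53)."""
    d = A.sum(axis=1)
    n = len(d)
    s = sum(1.0 / (d[u] * d[v])
            for u in range(n) for v in range(n) if A[u, v])
    return n / 4 - s / 4

def js_kernel(Ai, Aj):
    """K_JS = ln 2 - JS, with the direct product graph as the composite."""
    A_prod = np.kron(Ai, Aj)                      # direct product graph
    js = vn_entropy(A_prod) - 0.5 * (vn_entropy(Ai) + vn_entropy(Aj))
    return np.log(2) - js

tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
cyc4 = np.array([[0, 1, 0, 1], [1, 0, 1, 0],
                 [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
print(js_kernel(tri, cyc4))
```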

Page 64:

Page 65:

Page 66:

Learning

Page 67:

Generative Models

• Structural domain: define a probability distribution over a prototype structure. The prototype, together with the parameters of the distribution, minimises the description length (Torsello and Hancock, PAMI 2007).

• Spectral domain: embed the nodes of graphs into a vector space using spectral decomposition, and construct a point distribution model over the embedded positions of the nodes (Bai, Wilson and Hancock, CVIU 2009).

Page 68:

Deep learning

• Deep belief networks (Hinton 2006, Bengio 2007).

• Compositional networks (Amit+Geman 1999, Fergus 2010).

• Markov models (Leonardis 200

• Stochastic image grammars (Zhu, Mumford, Yuille).

• Taxonomy/category learning (Todorovic+Ahuja, 2006-2008).

Page 69:

Aim

• Combine spectral and structural methods.

• Use description length criterion.

• Apply to graphs rather than trees.

Page 70:

Prior work

• IJCV 2007 (Torsello, Robles-Kelly, Hancock) – shape classes from edit distance using pairwise clustering.

• PAMI 06 and Pattern Recognition 05 (Wilson, Luo and Hancock) – graph clustering using spectral features and polynomials.

• PAMI 07 (Torsello and Hancock) – generative model for variations in tree structure using description length.

• CVIU 09 (Xiao, Wilson and Hancock) – generative model from heat-kernel embedding of graphs.

Page 71:

Structural learning

Using description length

Page 72:

Description length

• Wallace+Freeman: minimum message length.

• Rissanen: minimum description length.

Use the log-posterior probability to locate the model that is optimal with respect to code-length.

Page 73:

Similarities/differences

• MDL: selection of the model is the aim; model parameters are simply a means to this end. Parameters are usually maximum likelihood. The prior on parameters is flat.

• MML: recovery of model parameters is central. The parameter prior may be more complex.

Page 74:

Coding scheme

• Usually assumed to follow an exponential distribution.

• Alternatives are universal codes and predictive codes.

• MML has two-part codes (model+parameters). In MDL the codes may be one- or two-part.

Page 75:

Method

• Model: a supergraph (i.e. graph prototype) formed by graph union.

• Sample-data observation model: Bernoulli distribution over nodes and edges.

• Model complexity: von Neumann entropy of the supergraph.

• Fitting criterion:

  MDL-like: make ML estimates of the Bernoulli parameters.

  MML-like: two-part code for data-model fit plus supergraph complexity.

Page 76:

Model overview

• Description length criterion:

  code-length = negative log-likelihood + model code-length (entropy)

  $L(\mathcal{G}, M) = -LL(\mathcal{G} \mid M) + H(M)$

• Data-set: set of graphs $\mathcal{G}$.

• Model: prototype graph plus correspondences with it.

• Updates by expectation maximisation: model graph adjacency matrix (M-step) plus correspondence indicators (E-step).

Page 77:

Learn supergraph using MDL

• Follow Torsello and Hancock, and pose the problem of learning a generative model for graphs as that of learning a supergraph representation.

• The required probability distribution is an extension of the model developed by Luo and Hancock.

• Use the von Neumann entropy to control the supergraph's complexity.

• Develop an EM algorithm in which the node correspondences and the supergraph edge probability matrix are treated as missing data (a sketch of the code-length being minimised follows).
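A minimal sketch of the objective, assuming a Bernoulli observation model per node pair and the Page 53 entropy as the model-cost term (all names and the thresholding step are illustrative, not the authors' exact formulation):

```python
import numpy as np

def overall_code_length(graphs, M, corrs):
    """L(G, M) = -LL(G | M) + H(M) for a supergraph edge-probability
    matrix M; corrs[k][u] is the (hypothetical) supergraph node matched
    to node u of sample graph k."""
    eps = 1e-12
    neg_ll = 0.0
    for A, c in zip(graphs, corrs):
        n = A.shape[0]
        for u in range(n):
            for v in range(n):
                p = np.clip(M[c[u], c[v]], eps, 1 - eps)
                neg_ll -= A[u, v] * np.log(p) + (1 - A[u, v]) * np.log(1 - p)

    # Model cost: von Neumann entropy of the thresholded supergraph
    As = (M > 0.5).astype(float)
    d = np.maximum(As.sum(axis=1), 1.0)         # guard isolated nodes
    m = len(d)
    s = sum(1.0 / (d[i] * d[j])
            for i in range(m) for j in range(m) if As[i, j])
    H = m / 4 - s / 4
    return neg_ll + H
```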

Page 78:

Probabilistic Framework

A four-node sample graph on vertices V1...V4 (V1, V2, V3 forming a triangle, with V4 attached to V3) has adjacency matrix

     V1 V2 V3 V4
V1 [  0  1  1  0 ]
V2 [  1  0  1  0 ]
V3 [  1  1  0  1 ]
V4 [  0  0  1  0 ]

Here the structure of the sample graphs and the supergraph are represented by their adjacency matrices.

Page 79:

Observation model

Given a sample graph and a supergraph, along with their assignment matrix, the a posteriori probability of the sample graph given the structure of the supergraph and the node correspondences is defined as:

Page 80:

Data code-length

• For the sample graph-set, the supergraph, and the set of assignments between them: under the assumption that the graphs in the sample set are independent samples from the distribution, the likelihood of the sample graphs can be written:

• Code length of observed data:

Page 81:

Overall code-length

According to Rissanen and Grünwald's minimum description length criterion, we encode and transmit both the sample graphs and the supergraph structure. This leads to a two-part message whose total length is the data code-length plus the model code-length.

We consider both the node correspondence information between graphs, S, and the structure of the supergraph, M, as missing data, and locate M by minimising the overall code-length using the EM algorithm.

Page 82:

EM – code-length criterion

Page 83:

Expectation + Maximization

• M-step: recover the correspondence matrices (take the partial derivative of the weighted log-likelihood function and apply softassign), and modify the supergraph structure.

• E-step: compute the a posteriori probability of the nodes in the sample graphs matching those of the supergraph.

Page 84:

Experiments

Delaunay graphs from images of different objects.

COIL dataset Toys dataset

Page 85:

Experiments---validation

COIL dataset: model complexity increases, graph-data log-likelihood increases, and overall code length decreases during the iterations.

Toys dataset: model complexity decreases, graph-data log-likelihood increases, and overall code length decreases during the iterations.

Page 86:

Experiments---classification task

We compare the classification performance of our learned supergraph with two alternative constructions: the median graph, and the supergraph learned without using MDL. The table below shows the average classification rates from 10-fold cross validation, together with their standard errors.

Page 87:

Experiments---graph embedding

Pairwise graph distance based on the Jensen-Shannon divergence and the von Neumann entropy of graphs.

Page 88:

Experiments---graph embedding

Edit distance JSD distance

Page 89:

Generative model

• Train on graphs with a set of predetermined characteristics.

• Sample using Monte Carlo.

• Reproduces characteristics of the training set, e.g. spectral gap, node degree distribution, etc.

Page 90:

Erdős-Rényi

Page 91:

Barabási-Albert (scale-free)

Page 92:

Delaunay Graphs

Page 93:

Experiments---generate new samples

Page 94:

Conclusions

• Shown how graph spectra can be used as characterisations of both structure and complexity.

• Presented an MDL framework which uses the complexity characterisation to learn a generative model of graph structure.

• Future: deeper measures of structure (symmetry) and the detailed dynamics of network evolution.