learning spectral graph transformations for link prediction
DESCRIPTION
We present a unified framework for learning link prediction and edge weight prediction functions in large networks, based on the transformation of a graph’s algebraic spectrum. Our approach generalizes several graph kernels and dimensionality reduction methods and provides a method to estimate their parameters efficiently. We show how the parameters of these prediction functions can be learned by reducing the problem to a one-dimensional regression problem whose runtime only depends on the method’s reduced rank and that can be inspected visually. We derive variants that apply to undirected, weighted, unweighted, unipartite and bipartite graphs. We evaluate our method experimentally using examples from social networks, collaborative filtering, trust networks, citation networks, authorship graphs and hyperlink networks.
TRANSCRIPT
Learning Spectral Graph Transformations for Link Prediction
Jérôme Kunegis & Andreas Lommatzsch
DAI-Labor, Technische Universität Berlin, Germany
26th Int. Conf. on Machine Learning, Montréal 2009
Kunegis & Lommatzsch Learning Spectral Graph Transformations for Link Prediction 1
Outline
The Problem
– Link prediction
– Known solutions
Learning
– Spectral transformations
– Finding the best spectral transformation
Variants
– Weighted and signed graphs
– Bipartite graphs and the SVD
– Graph Laplacian and normalization
Some Applications
The Problem: Link Prediction
Motivation: Recommend connections in a social network
Predict links in an undirected, unweighted network
Using the adjacency matrices A and B, find a function F(A) giving prediction values corresponding to B
F(A) = B
Known links (A)
Links to be predicted (B)
Follow paths
Number of paths of length k given by Aᵏ
– Nodes connected by many paths
– Nodes connected by short paths
Weight powers of A: αA² + βA³ + γA⁴ + … with α > β > γ > … > 0
Examples:
Exponential graph kernel: e^(αA) = Σᵢ (αⁱ/i!) Aⁱ
Von Neumann kernel: (I − αA)⁻¹ = Σᵢ αⁱ Aⁱ (with 0 < α < 1)
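Both kernels can be sketched directly from powers of A; the toy graph, the value of α, and the series truncation depth below are illustrative assumptions, not from the talk:

```python
import numpy as np

# Toy undirected graph: a path on 4 nodes (illustrative, not from the talk).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

alpha = 0.3  # illustrative; alpha must be below 1/spectral_radius(A) to converge

# Exponential graph kernel: e^(alpha A) = sum_i alpha^i / i! * A^i (truncated)
expm = np.zeros_like(A)
term = np.eye(4)
for i in range(1, 30):
    expm += term                    # add the (i-1)-th series term
    term = term @ A * (alpha / i)   # next term: alpha^i A^i / i!

# Von Neumann kernel: (I - alpha A)^(-1) = sum_i alpha^i A^i (closed form)
neumann = np.linalg.inv(np.eye(4) - alpha * A)
```

Nodes connected by more and shorter paths receive larger scores; on this path graph, expm[0, 1] > expm[0, 2] > expm[0, 3].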
Path Counting
Graph Laplacian L = D − A
“Resistance distance” L⁺ (a.k.a. commute time)
Regularized Laplacian (I + αL)−1
Heat diffusion kernel e−αL
Laplacian Link Prediction Functions
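The Laplacian kernels listed above can be sketched as follows; the toy graph and the value of α are illustrative assumptions:

```python
import numpy as np

# Toy undirected graph: a path on 4 nodes (illustrative).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                                   # graph Laplacian L = D - A

Lplus = np.linalg.pinv(L)                   # "resistance distance" kernel L+

alpha = 0.5                                 # illustrative parameter
reg = np.linalg.inv(np.eye(4) + alpha * L)  # regularized Laplacian (I + alpha L)^(-1)

# Heat diffusion kernel e^(-alpha L), via the eigendecomposition of L
lam, U = np.linalg.eigh(L)
heat = U @ np.diag(np.exp(-alpha * lam)) @ U.T
```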
Computation of Link Prediction Functions

Adjacency matrix: eigenvalue decomposition A = UΛUᵀ
– Matrix polynomial Σᵢ αᵢAⁱ = U (Σᵢ αᵢΛⁱ) Uᵀ
– Matrix exponential e^(αA) = U e^(αΛ) Uᵀ
– Von Neumann kernel (I − αA)⁻¹ = U (I − αΛ)⁻¹ Uᵀ
– Rank reduction A⁽ᵏ⁾ = U Λ⁽ᵏ⁾ Uᵀ

Graph Laplacian: eigenvalue decomposition L = D − A = UΛUᵀ
– Resistance distance L⁺ = U Λ⁺ Uᵀ
– Regularized Laplacian (I + αL)⁻¹ = U (I + αΛ)⁻¹ Uᵀ
– Heat diffusion kernel e^(−αL) = U e^(−αΛ) Uᵀ

Each function is a spectral transformation: it acts only on the eigenvalues.
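The common pattern, one eigendecomposition followed by an entrywise function of the eigenvalues, can be sketched like this (the toy graph and parameters are illustrative):

```python
import numpy as np

# Toy graph: a triangle (illustrative).
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)

lam, U = np.linalg.eigh(A)    # A = U diag(lam) U^T

def spectral_transform(f):
    """Apply F(A) = U f(Lambda) U^T, with f acting entrywise on eigenvalues."""
    return U @ np.diag(f(lam)) @ U.T

alpha = 0.2                   # illustrative; alpha * max eigenvalue < 1 here
expk = spectral_transform(lambda x: np.exp(alpha * x))          # e^(alpha A)
neumann = spectral_transform(lambda x: 1.0 / (1 - alpha * x))   # (I - alpha A)^(-1)
```

The same decomposition is reused for every kernel; only the scalar function changes.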
Learning Spectral Transformations
Link prediction functions are spectral transformations of A or L
F(A) = U F(Λ) Uᵀ, with F(Λ)ᵢᵢ = f(Λᵢᵢ)
A spectral transformation F corresponds to a function f on the reals:
– Matrix polynomial F(A) = Σᵢ αᵢAⁱ ↔ f(x) = Σᵢ αᵢxⁱ (real polynomial)
– Matrix exponential F(A) = e^(αA) ↔ f(x) = e^(αx) (real exponential)
– Matrix inverse F(A) = (I ± αA)⁻¹ ↔ f(x) = 1/(1 ± αx) (rational function)
– Pseudoinverse F(A) = A⁺ ↔ f(x) = 1/x when x > 0, 0 otherwise
– Rank-k approximation F(A) = A⁽ᵏ⁾ ↔ f(x) = x when |x| ≥ x₀, 0 otherwise
Finding the Best Spectral Transformation
Find the best spectral transformation on the test set B:
  min_F ‖F(A) − B‖_F
Rewrite using the eigendecomposition A = UΛUᵀ:
  = min_F ‖U F(Λ) Uᵀ − B‖_F
  = min_F ‖F(Λ) − UᵀBU‖_F   (the Frobenius norm is preserved by the orthogonal U)
Reduce to the diagonal, because the off-diagonal of F(Λ) is constant zero:
  min_f Σᵢ (f(Λᵢᵢ) − (UᵀBU)ᵢᵢ)²
The best spectral transformation is given by a one-dimensional least-squares problem.
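The reduction can be illustrated end to end: build a toy target B that is a known spectral transformation of A, fit the one-dimensional curve by least squares, and recover B. The random data and the choice of a polynomial fit via np.polyfit are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30

# Toy symmetric source graph A; target B is a known spectral
# transformation of A (here B = A + 0.1 A^2). All illustrative.
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1)
A = A + A.T
B = A + 0.1 * (A @ A)

lam, U = np.linalg.eigh(A)
x = lam                          # eigenvalues Lambda_ii
y = np.diag(U.T @ B @ U)         # targets (U^T B U)_ii

# One-dimensional least-squares fit of f (here: a degree-2 polynomial)
coeffs = np.polyfit(x, y, deg=2)
F = U @ np.diag(np.polyval(coeffs, lam)) @ U.T   # learned prediction F(A)
```

Because B was constructed as a degree-2 polynomial of A, the fitted curve recovers it exactly up to numerical error; in practice the scatter of (Λᵢᵢ, (UᵀBU)ᵢᵢ) pairs can also be inspected visually, as on the following slides.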
Example: DBLP Citation Network (undirected)
DBLP citation network
Symmetric adjacency matrices A = UΛUᵀ and B
[Plot: Λᵢᵢ vs. (UᵀBU)ᵢᵢ, with fitted curves βe^(αx); αx³ + βx² + γx + δ; x when x < x₀, else 0]
Variants: Weighted and Signed Graphs
Weighted undirected graphs: use A and L = D − A as is
Signed graphs: use Dᵢᵢ = Σⱼ |Aᵢⱼ| (signed graph Laplacian)
Example: Slashdot Zoo (a social network with negative edges)
[Plot: spectral fit for the signed Slashdot Zoo network]
Bipartite Graphs
In bipartite graphs, paths between the two vertex sets have odd length
Compute a sum of odd powers of A; the resulting polynomial is odd:
αA³ + βA⁵ + …
For other link prediction functions, use the odd component:
– e^(αA) → sinh(αA)
– (I − αA)⁻¹ → αA (I − α²A²)⁻¹
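A sketch of the two odd components (the toy bipartite graph and α are illustrative assumptions):

```python
import numpy as np

# Toy bipartite graph with biadjacency matrix R (illustrative).
R = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=float)
A = np.block([[np.zeros((2, 2)), R],
              [R.T, np.zeros((3, 3))]])

alpha = 0.3  # illustrative; alpha * largest singular value of R < 1 here

# Odd component of e^(alpha A): sinh(alpha A), via the eigendecomposition of A
lam, U = np.linalg.eigh(A)
sinh_k = U @ np.diag(np.sinh(alpha * lam)) @ U.T

# Odd component of (I - alpha A)^(-1): alpha A (I - alpha^2 A^2)^(-1)
I = np.eye(5)
odd_neumann = alpha * A @ np.linalg.inv(I - alpha**2 * (A @ A))
```

Both results are nonzero only between the two vertex sets, as expected of odd functions of a bipartite adjacency matrix.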
Singular Value Decomposition
Odd power of a bipartite graph’s adjacency matrix
A^(2n+1) = [0 R; Rᵀ 0]^(2n+1) = [0 (RRᵀ)ⁿR; Rᵀ(RRᵀ)ⁿ 0]
Using the singular value decomposition R = UΣVᵀ:
(RRᵀ)ⁿR = (UΣVᵀ VΣUᵀ)ⁿ UΣVᵀ = (UΣ²Uᵀ)ⁿ UΣVᵀ = UΣ^(2n+1)Vᵀ
Odd powers of A are given by odd spectral transformations of R
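The identity can be checked numerically for n = 1 (the small R below is illustrative):

```python
import numpy as np

# Biadjacency matrix R of a toy bipartite graph (illustrative).
R = np.array([[2., 0., 1.],
              [1., 1., 0.]])
A = np.block([[np.zeros((2, 2)), R],
              [R.T, np.zeros((3, 3))]])

# Reduced SVD R = U Sigma V^T
U, s, Vt = np.linalg.svd(R, full_matrices=False)

n = 1                                             # check the case A^3
top_right = U @ np.diag(s ** (2 * n + 1)) @ Vt    # U Sigma^(2n+1) V^T
A_odd = np.linalg.matrix_power(A, 2 * n + 1)      # direct computation of A^3
```

The top-right block of A³ matches U Σ³ Vᵀ, so odd spectral transformations of the bipartite A reduce to transformations of the singular values of R.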
SVD Example: Bipartite Rating Graph
MovieLens rating graph
Rating values{−2, −1, 0, +1, +2}
[Plot: one-dimensional fit on the singular values, with curves αx⁵ + βx³ + γx; αx (x < x₀); β sinh(αx); sgn(x)(α|x|² + β|x| + γ) (x < x₀)]
Base Matrices
Base matrices A and L
Normalized variants: D^(−1/2) A D^(−1/2) and D^(−1/2) L D^(−1/2) = I − D^(−1/2) A D^(−1/2)
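A sketch of the normalized variants on a toy graph (illustrative); for a connected graph the largest eigenvalue of the normalized adjacency matrix is exactly 1, which is the eigenvalue the slide says to ignore:

```python
import numpy as np

# Toy connected graph (illustrative): triangle 0-1-2 plus pendant node 3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
Dinv_sqrt = np.diag(1.0 / np.sqrt(d))

A_norm = Dinv_sqrt @ A @ Dinv_sqrt   # D^(-1/2) A D^(-1/2)
L_norm = np.eye(4) - A_norm          # D^(-1/2) L D^(-1/2) = I - D^(-1/2) A D^(-1/2)

lam, U = np.linalg.eigh(A_norm)      # eigenvalues lie in [-1, 1]
```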
Example: D^(−1/2) A D^(−1/2) on the Advogato trust network
Note: ignore the eigenvalue 1 (constant eigenvector)
[Plot: fitted curves αx² + βx + γ (α, β, γ ≥ 0); β/(1 − αx); βe^(αx)]
Learning Laplacian Kernels
Epinions (signed user-user network), using L
Note: Λᵢᵢ > 0 because the graph is signed and unbalanced
[Plot: fitted curves β/(1 + αx); βe^(−αx)]
Almost Bipartite Networks
Dating site LíbímSeTi.cz (users rate users)
Some networks are “almost” bipartite
The plot has near central symmetry
Bipartition: men/women
[Plot: fitted curves αx³ + βx² + γx + δ and β sinh(αx), showing near central symmetry]
Learning the Reduced Rank k
Some plots suggest a reduced rank k
Example: MovieLens/SVD
Learned: k = 14
[Plot: beyond the first k significant singular values, (UᵀBU)ᵢᵢ ≈ 0]
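Rank reduction itself is the spectral transformation f(x) = x for the k largest singular values and 0 otherwise; a sketch with illustrative random data (the real slide learned k = 14 on MovieLens):

```python
import numpy as np

rng = np.random.default_rng(1)
R = rng.standard_normal((20, 15))   # illustrative stand-in for a rating matrix

U, s, Vt = np.linalg.svd(R, full_matrices=False)  # s is sorted descending

k = 5                                # in practice: read off where the fit drops to ~0
s_k = np.where(np.arange(len(s)) < k, s, 0.0)     # zero all but the top k
R_k = U @ np.diag(s_k) @ Vt          # rank-k approximation R^(k)
```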
Conclusion & Ongoing Research
Conclusions
– Many link prediction functions are spectral transformations
– Spectral transformations can be learned
Ongoing Research
– New link prediction functions: sinh(A), the odd von Neumann (pseudo-)kernel
– Signed graph Laplacian
– Other matrix decompositions
– Other norms
– More datasets
Thank You