TRANSCRIPT
Differential Deep Learning on Graphs and its Applications
Chengxi Zang and Fei Wang, Weill Cornell Medicine
www.calvinzang.com
DIFFERENTIAL DEEP LEARNING ON GRAPHS AND ITS APPLICATIONS --- AAAI-2020 1
This Tutorial
www.calvinzang.com/DDLG_AAAI_2020.html
AAAI-2020, Friday, February 7, 2020, 2:00 PM - 6:00 PM
Sutton North, Hilton New York Midtown, NYC
DIFFERENTIAL DEEP LEARNING ON GRAPHS AND ITS APPLICATIONS --- AAAI-2020 2
This Tutorial
Molecular Graph Generation: to generate novel molecules with optimized properties
o Graph generation
o Graph property prediction
o Graph optimization
Learning Dynamics on Graphs: to predict temporal change or final states of complex systems
o Continuous-time dynamics prediction
o Structured sequence prediction
o Node classification/regression
Mechanism Discovery: to find dynamical laws of complex systems
o Density Estimation vs. Mechanism Discovery
o Data-driven discovery of differential equations
Part 2: Neural Dynamics on Complex Networks
Chengxi Zang and Fei Wang
Weill Cornell Medicine
www.calvinzang.com
Structures and Dynamics of Complex Systems
Brain and bioelectrical flow; transportation and traffic flow
Social networks and information flow; ecological systems and energy flow
Problem: Learning Dynamics of Complex Systems
Brain and bioelectrical flow; transportation and traffic flow
Social networks and information flow; ecological systems and energy flow
Dynamics? How can we predict the temporal change of these complex systems?
Problem: Math Formulation
Learning dynamics on a graph:
o Dynamics of nodes: X(t) ∈ ℝ^{n×d} at time t, where n is the number of nodes and d is the number of features; X(t) changes over continuous time t
o Graph: G = (V, E), where V are the nodes and E are the edges
o How do the dynamics dX(t)/dt = f(X(t), G, θ, t) evolve on the graph?
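As a numerical sketch of this formulation (illustrative names only, not code from the tutorial): given an adjacency matrix G, an initial state X(0) ∈ ℝ^{n×d}, and any right-hand side f, forward-Euler integration yields X(t) on an arbitrary time grid. Here f is a simple linear diffusion along edges, standing in for a generic f(X(t), G, θ, t).

```python
import numpy as np

def euler_integrate(f, X0, G, theta, t_grid):
    """Integrate dX/dt = f(X, G, theta, t) with forward Euler.

    X0: (n, d) initial node states; t_grid: increasing time points.
    Returns the trajectory X(t) at every point of t_grid.
    """
    X = X0.copy()
    traj = [X.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        X = X + (t1 - t0) * f(X, G, theta, t0)
        traj.append(X.copy())
    return np.stack(traj)

def f_linear(X, G, theta, t):
    """Linear diffusion: -theta * L X, with L the graph Laplacian."""
    deg = G.sum(axis=1, keepdims=True)    # node degrees
    return -theta * (deg * X - G @ X)

G = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])   # 3-node path graph
X0 = np.array([[1.0], [0.0], [0.0]])                        # all mass on node 0
traj = euler_integrate(f_linear, X0, G, theta=0.5,
                       t_grid=np.linspace(0, 5, 501))
# Diffusion conserves total mass and relaxes toward the uniform state.
```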
Problem: Prediction Tasks
Continuous-time network dynamics prediction:
o Input: G, X̂(t1), X̂(t2), ..., X̂(tT), where 0 ≤ t1 < ... < tT are arbitrary time moments
o Model: dynamics on graphs dX(t)/dt = f(X(t), G, θ, t)
o Output: predict X(t) at an arbitrary time moment
(Special case) Structured sequence prediction:
o Input: G, X̂[1], X̂[2], ..., X̂[T], an ordered sequence
o Model: dynamics on graphs dX(t)/dt = f(X(t), G, θ, t)
o Output: predict the next k steps, X[T + k]
(Special case) Node (semi-supervised) regression/classification:
o Input: G, X̂ = [X̂, Mask ⊙ Ŷ], features and some node labels, only one snapshot
o Model: dynamics on graphs dX(t)/dt = f(X(t), G, θ, t)
o Output: predict [X, Y]
Why Do Dynamics Matter?
To understand, predict, and control real-world dynamic systems in engineering and science.
o Brain dynamics, traffic dynamics, social dynamics
Challenges: Dynamics of Complex Systems
Complex systems:
o High dimensionality and complex interactions
o ≥ 100 nodes, ≥ 1000 interactions
Dynamics:
o Continuous-time, nonlinear
Structural-dynamic dependencies:
o Difficult to model with simple mechanistic models
Challenges: Dynamics of Complex Systems
[Figure: Linear Dynamics + Linear Dynamics + Non-Linear Dynamics combined on a graph; what is f(X(t), G, θ, t)?]
Examples of dynamics on graphs
Related Works 1: Learning Continuous-Time Dynamics
To learn continuous-time dynamics:
o Requires clear knowledge of the mechanisms; small systems with few interaction terms
o First principles from physical laws; mechanistic models, e.g., F = d(mv)/dt
Data-driven Dynamics for Small Systems
Data-driven discovery of ODEs/PDEs:
o Sparse regression
o Residual networks
o Etc.
Small systems only!
o < 10 nodes & interactions
o Combinatorial complexity
o Not for complex systems
Image from: Brunton et al. 2016. Discovering governing equations from data by sparse identification of nonlinear dynamical systems. PNAS
Related Works 2: Structured Sequence Learning
Defined characteristics:
o Dynamics on graphs are regularly sampled with the same time intervals
Temporal graph neural networks:
o RNN + CNN
o RNN + GNN, e.g., X[t+1] = LSTM(GCN(X[t], G))
Limitations:
o Only ordered sequences instead of continuous physical time
Seo et al. 2016. Structured Sequence Modeling with Graph Convolutional Recurrent Networks.
Wu et al. 2019. A Comprehensive Survey on Graph Neural Networks.
Related Works 3: Node (Semi-supervised) Classification/Regression
Defined characteristics:
o One snapshot of features and some labels on a graph
o Goal: assign a label to each node
Graph neural networks:
o GCN, GAT, etc.
Limitations:
o Only 1 or 2 layers
o Lacking a continuous-time dynamics view
Spreading features or labels on graphs in continuous time allows more fine-grained control of the diffusion.
Kipf et al. 2016. Semi-Supervised Classification with Graph Convolutional Networks.
Velickovic et al. 2017. Graph Attention Networks.
Goal: A Unified Framework for All?
Continuous-time network dynamics prediction:
o Input: G, X̂(t1), X̂(t2), ..., X̂(tT), where 0 ≤ t1 < ... < tT are arbitrary time moments
o Model: dynamics on graphs dX(t)/dt = f(X(t), G, θ, t)
o Output: predict X(t) at an arbitrary time moment
(Special case) Structured sequence prediction:
o Input: G, X̂[1], X̂[2], ..., X̂[T], an ordered sequence
o Model: dynamics on graphs dX(t)/dt = f(X(t), G, θ, t)
o Output: predict the next k steps, X[T + k]
(Special case) Node (semi-supervised) regression/classification:
o Input: G, X̂ = [X̂, Mask ⊙ Ŷ], features and some node labels, only one snapshot
o Model: dynamics on graphs dX(t)/dt = f(X(t), G, θ, t)
o Output: predict [X, Y]
Our Ideas
Differential equation systems:
o Graphs and differential equations are general tools for describing the structures and dynamics of complex systems
Deep learning:
o RNNs, GNNs, temporal GNNs, ResNets, etc. are state-of-the-art data-driven computational tools
How can we leverage differential equation systems and deep learning together?
Neural Dynamics on Complex Networks (NDCN)
Differential deep learning:
o Differential equation system: dX(t)/dt = f(X(t), G, W, t) is a graph-neural-network-like structure
o Differential deep model: X(t) = X(0) + ∫0^t f(X(τ), G, W, τ) dτ for arbitrary time t
o Learned by solving the following optimization problem:
Our NDCN model:
argmin L = ∫0^T |X(t) − X̂(t)| dt
s.t. X_h(0) = tanh(X(0) W_e + b_e) W_0 + b_0   (encoder)
     dX_h(t)/dt = ReLU(Φ X_h(t) W + b)          (graph ODE)
     X(t) = X_h(t) W_d + b_d                    (decoder)
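A minimal end-to-end sketch of the three NDCN components above (encoder, graph ODE, decoder), with forward Euler standing in for a proper ODE solver and the diffusion operator Φ taken as the normalized adjacency matrix; all shapes, initializations, and parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalized_adj(A):
    """Phi = D^{-1/2} (A + I) D^{-1/2}, a common graph diffusion operator."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def ndcn_forward(X0, A, params, t, dt=0.01):
    """X(t) = decode( X_h(0) + integral of ReLU(Phi X_h W + b) up to t )."""
    We, be, W0, b0, W, b, Wd, bd = params
    Phi = normalized_adj(A)
    Xh = np.tanh(X0 @ We + be) @ W0 + b0            # encoder -> X_h(0)
    for _ in range(int(round(t / dt))):             # graph ODE, Euler steps
        Xh = Xh + dt * np.maximum(0.0, Phi @ Xh @ W + b)
    return Xh @ Wd + bd                             # decoder -> X(t)

n, d, h = 4, 1, 8                                   # nodes, features, hidden dim
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0],
              [1, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
params = (rng.normal(0, 0.1, (d, h)), np.zeros(h),  # W_e, b_e
          rng.normal(0, 0.1, (h, h)), np.zeros(h),  # W_0, b_0
          rng.normal(0, 0.1, (h, h)), np.zeros(h),  # W,   b
          rng.normal(0, 0.1, (h, d)), np.zeros(d))  # W_d, b_d
X0 = rng.normal(size=(n, d))
X_pred = ndcn_forward(X0, A, params, t=1.0)         # state at arbitrary time t
```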
Interpretation from Residual Learning
Deep learning: f* is a neural layer
o Each layer: X[l+1] = f_{l+1}(X[l])
o Deep model: X[L] = f_L ∘ ... ∘ f_1(X[0])
Residual learning: deeper
o Each layer: X[l+1] = X[l] + f_{l+1}(X[l])
o Deep model: X[L] = (f_L + I) ∘ ... ∘ (f_1 + I)(X[0])
Differential deep learning:
o Each time moment ("layer"): instantaneous rate at t: dX/dt = f(X(t))
  Discrete layers vs. continuous time moments; neural mappings vs. neural differential equation systems
o Continuous-time ("deep") model: X(t) = X(0) + ∫0^t f(X(τ), W, τ) dτ
  Integration over continuous time; a sequence of mappings vs. continuous integration; the trajectory of the dynamics
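The residual-vs-differential correspondence can be checked numerically: a residual layer X[l+1] = X[l] + f(X[l]) is exactly one forward-Euler step of dX/dt = f(X) with step size 1, and shrinking the step while stacking more "layers" recovers the continuous trajectory. A toy sketch with the illustrative vector field f(X) = −X/2:

```python
import numpy as np

f = lambda X: -0.5 * X          # a simple "layer" / vector field: dX/dt = -X/2

def residual_net(X, n_layers, step=1.0):
    """Stacked residual layers == Euler integration with step size `step`."""
    for _ in range(n_layers):
        X = X + step * f(X)     # X[l+1] = X[l] + step * f(X[l])
    return X

X0 = np.array([1.0])
coarse = residual_net(X0, n_layers=2, step=1.0)     # 2 residual layers, t = 2
fine = residual_net(X0, n_layers=2000, step=0.001)  # 2000 tiny layers, t = 2
exact = X0 * np.exp(-0.5 * 2.0)                     # closed-form X(2)
# The many-small-steps model tracks the continuous solution far better.
```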
Interpretation from Graph Neural Networks
GNN, Residual-GNN, ODE-GNN, NDCN:
o GNN: X_{t+1} = f(G, X_t, θ_t)
o Residual-GNN: X_{t+1} = X_t + f(G, X_t, θ_t)
o Differential-GNN: X_{t+δ} = X_t + δ f(G, X_t, θ_t); as δ → 0, dX/dt = f(G, X_t, θ_t)
Our model is a Differential-GNN with continuous layers of real-number depth.
Interpretation from RNNs and Temporal GNNs
RNN, temporal GNN, and our model:
o RNN or temporal GNN: h_t = f(h_{t−1}, x_t, θ_t), or h_t = f(h_{t−1}, G ∗ x_t, θ_t); output y_t = o(h_t, w_t)
o Residual RNN or temporal GNN with skip connections: h_t = h_{t−1} + f(h_{t−1}, x_t, θ_t), or h_t = h_{t−1} + f(h_{t−1}, G ∗ x_t, θ_t); output y_t = o(h_t, w_t)
o Differential RNN or differential GNN: dh_t/dt = f(h_t, x_t, θ_t), or dh_t/dt = f(h_t, G ∗ x_t, θ_t); output y_t = o(h_t, w_t)
Our model is a differential GNN:
o Learns continuous-time network dynamics
o Encompasses temporal GNNs by discretization
o Encompasses RNNs by not using graph convolution
Exp1: Learning Continuous-time Network Dynamics
The problem:
o Input: X̂(t1), X̂(t2), ..., X̂(tT), where 0 ≤ t1 < ... < tT are arbitrary time moments with different time intervals
o Output: X(t) at an arbitrary time moment t
  Interpolation prediction: t < tT and t ∉ {t1, ..., tT}
  Extrapolation prediction: t > tT
Setups:
o 120 irregularly sampled snapshots of the network dynamics
o First 100: 80 for training, 20 for testing interpolation
o Last 20: testing extrapolation
Canonical Dynamics on Graphs in Physics and Biology
Real-world dynamics on a graph (adjacency matrix A):
o Heat diffusion: dx_i(t)/dt = −k_{i,j} Σ_{j=1}^n A_{i,j} (x_i(t) − x_j(t))
o Mutualistic interaction: dx_i(t)/dt = b_i + x_i(t)(1 − x_i(t)/k_i)(x_i(t)/c_i − 1) + Σ_{j=1}^n A_{i,j} x_i(t) x_j(t) / (d_i + e_i x_i(t) + h_j x_j(t))
o Gene regulatory: dx_i(t)/dt = −b_i x_i(t)^f + Σ_{j=1}^n A_{i,j} x_j(t)^h / (x_j(t)^h + 1)
Graphs:
o Grid, random, power-law, small-world, community, etc.
Visualizing dynamics on a graph:
o Nodes are numbered by community labels
o Mapped onto an ℕ² grid
o X(t) ∈ ℝ^{n×1}: ℕ² → ℝ
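Each of these canonical dynamics is just a right-hand-side function of the node states and the adjacency matrix. As one instance, a sketch of the gene-regulatory dynamics on a small graph; the parameter values b, f, h and the triangle graph are illustrative choices, not the tutorial's experimental settings.

```python
import numpy as np

def gene_regulatory(x, A, b=1.0, f=1.0, h=2.0):
    """dx_i/dt = -b_i x_i^f + sum_j A_ij x_j^h / (x_j^h + 1)."""
    hill = x**h / (x**h + 1.0)      # saturating (Hill) activation from neighbors
    return -b * x**f + A @ hill

A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])        # triangle graph
x = np.full(3, 2.0)                 # initial expression levels
dt = 0.01
for _ in range(10000):              # forward Euler toward steady state
    x = x + dt * gene_regulatory(x, A)
# For this symmetric setup the expression levels settle near x_i = 1.
```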
Exp1: Learning Continuous-time Network Dynamics
Baselines: ablation models
o Differential-GNN: no encoding layer
o Neural ODE network: no graph diffusion
o NDCN without control parameter W: determined dynamics
Chen et al. 2018. Neural Ordinary Differential Equations. NeurIPS.
Exp1: Heat Diffusion on Different Graphs
Exp1: Mutualistic Dynamics on Different Graphs
Exp1: Gene Dynamics on Different Graphs
Exp1: Results for Continuous-time Extrapolation
o Mean Absolute Percentage Error (MAPE); 20 runs for 3 dynamics on 5 graphs
o Our model achieves the lowest error
Exp1: Results for Continuous-time Interpolation
o Interpolation is easier than extrapolation
o Our model achieves the lowest error
Exp2: Structured Sequence Prediction
The problem (structured sequence prediction):
o Input: X̂[1], X̂[2], ..., X̂[T], regularly sampled with the same time intervals; the emphasis is on the ordered sequence rather than physical time
o Output: X[T + M], the next M steps (extrapolation prediction)
Setups:
o 100 regularly sampled snapshots of the network dynamics
o First 80 for training, last 20 for testing
Exp2: Structured Sequence Prediction
Baselines: temporal-GNN models
o LSTM-GNN: X[t+1] = LSTM(GCN(X[t], G))
o GRU-GNN: X[t+1] = GRU(GCN(X[t], G))
o RNN-GNN: X[t+1] = RNN(GCN(X[t], G))
Seo et al. 2016. Structured Sequence Modeling with Graph Convolutional Recurrent Networks.
Wu et al. 2019. A Comprehensive Survey on Graph Neural Networks.
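A minimal sketch of how one step of these baselines composes a graph convolution with a recurrent cell, here the RNN-GNN variant X[t+1] = RNN(GCN(X[t], G)); the vanilla RNN cell, the output projection, and all weight shapes are illustrative assumptions, not the papers' exact architectures.

```python
import numpy as np

rng = np.random.default_rng(1)

def gcn_layer(X, A, W):
    """One GCN layer: D^{-1/2} (A + I) D^{-1/2} X W."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return A_norm @ X @ W

def rnn_gnn_step(X_t, H, A, params):
    """X[t+1] = RNN(GCN(X[t], G)): graph conv, then a vanilla RNN cell."""
    Wg, Wx, Wh, b, Wo = params
    Z = gcn_layer(X_t, A, Wg)                 # per-node graph features
    H = np.tanh(Z @ Wx + H @ Wh + b)          # recurrent hidden state
    return H @ Wo, H                          # project back to feature space

n, d, h = 4, 2, 8                             # nodes, features, hidden dim
A = np.array([[0, 1, 0, 0], [1, 0, 1, 1],
              [0, 1, 0, 1], [0, 1, 1, 0]], dtype=float)
params = (rng.normal(0, 0.1, (d, h)), rng.normal(0, 0.1, (h, h)),
          rng.normal(0, 0.1, (h, h)), np.zeros(h),
          rng.normal(0, 0.1, (h, d)))
X, H = rng.normal(size=(n, d)), np.zeros((n, h))
for _ in range(5):                            # roll the sequence out 5 steps
    X, H = rnn_gnn_step(X, H, A, params)
```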
Exp2: Structured Sequence Prediction
Results:
o Our model achieves the lowest error with far fewer parameters
Learnable parameters:
o LSTM-GNN: 84,890; GRU-GNN: 64,770; RNN-GNN: 24,530
o NDCN: 901
Exp3. Node Semi-supervised Classification
The problem:
o One-snapshot case
o Input: G, X, and part of the labels Y(X)
o Output: complete Y(X)
Datasets:
Exp3. Node Semi-supervised Classification
Baselines:
o Graph Convolutional Network (GCN)
o Attention-based GNN (AGNN)
o Graph Attention Network (GAT)
Kipf et al. 2016. Semi-Supervised Classification with Graph Convolutional NetworksVelickovic et al. 2017. Graph Attention Networks
Exp3. Node Semi-supervised Classification
Interpretation of the model:
o Input: G, [X, Mask ⊙ Y], features and some node labels
o Output: complete Y
o Model: graph dynamics that spread features and labels over time T: d[X, Y]/dt = f(G, X, Y, W)
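This diffusion view can be sketched directly: spread the masked label matrix along the graph with heat-like dynamics dY/dt = −L Y up to time T, then read off each node's class by argmax. The toy graph (two triangles joined by one edge), the single labeled node per class, and the stopping time are illustrative, not the tutorial's experimental setup.

```python
import numpy as np

def spread_labels(A, Y_masked, T=1.2, dt=0.01):
    """Integrate dY/dt = -L Y: labels diffuse along edges up to time T."""
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
    Y = Y_masked.astype(float).copy()
    for _ in range(int(round(T / dt))):     # forward Euler
        Y = Y - dt * (L @ Y)
    return Y.argmax(axis=1)                 # predicted class per node

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2,3).
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
Y = np.zeros((6, 2))
Y[0, 0] = 1.0                               # one labeled node per class;
Y[5, 1] = 1.0                               # all other rows stay masked (zero)
pred = spread_labels(A, Y)
# Diffusion assigns each unlabeled node the class of its nearer labeled node:
# nodes 0-2 get class 0, nodes 3-5 get class 1.
```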
Exp3. Node Semi-supervised Classification
Metrics:
o Accuracy over 100 runs
Results:
o Continuous-time dynamics on graphs
o Best results at time T = 1.2 (continuous depth/time)
o No dropout used
Summary
Our NDCN, a unified framework that solves:
o Continuous-time network dynamics prediction
o Structured sequence prediction
o Node regression/classification at the final state
o Good performance with fewer parameters
Differential deep learning on graphs:
o A potential data-driven method to model the structure and dynamics of complex systems in a unified framework
Differential Deep Learning on Graphs and its Applications
Chengxi Zang and Fei Wang
Weill Cornell Medicine
www.calvinzang.com
Thank You!