Reformulated Neural Network (ReNN)
A New Alternative for Data-driven Modelling in Hydrology and Water Resources Engineering
Saman Razavi1, Bryan Tolson1, Donald Burn1, and Frank Seglenieks2
1 Department of Civil and Environmental Engineering, University of Waterloo, Waterloo, Ontario, Canada
2 Environment Canada, Burlington, Ontario, Canada
Outline of the Presentation
Introduction to Reformulated Neural Network
Application 1 – Metamodelling
Application 2 – Rainfall-Runoff Modelling
A New Measure of Regularization
Summary
Reformulated Neural Network

ReNN is:
- Essentially a single-hidden-layer neural network
- Defined on a new set of variables based on the network's internal geometry

[Figure: Multilayer Perceptron (Traditional Neural Network) - a single-hidden-layer schematic with inputs x1, x2, ..., xR in the input layer; input weights Iw_i,j, hidden biases Hb_i, and hidden-unit outputs Ho_i in the hidden layer; hidden-to-output weights Hw_i, output bias Ob, and network output y in the output layer]

Main Features:
- ReNN is more efficient in training
- ReNN variables are interpretable
- ReNN is more predictable in generalization
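For reference, a minimal sketch of the traditional network in the figure above, using the same symbols (the tanh activation is an assumption; see Razavi and Tolson (2011) for the exact formulation):

```python
import numpy as np

def mlp_forward(x, Iw, Hb, Hw, Ob):
    """Single-hidden-layer perceptron in the figure's notation.

    x  : inputs x1..xR, shape (R,)
    Iw : input-to-hidden weights Iw_i,j, shape (n, R)
    Hb : hidden biases Hb_i, shape (n,)
    Hw : hidden-to-output weights Hw_i, shape (n,)
    Ob : output bias (scalar)
    """
    Ho = np.tanh(Iw @ x + Hb)  # hidden-unit outputs Ho_1..Ho_n (tanh assumed)
    return Hw @ Ho + Ob        # network output y
```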
Reformulated Neural Network
ReNN variables in 1-input problems

[Figure: the contribution Hw_i * Ho_i of a single hidden unit plotted against the input x1; the sigmoidal curve is annotated with its Height, Location, and Slope]

For hidden unit i with input weight Iw_i,1, hidden bias Hb_i, and output-layer weight Hw_i:
- Height: Hw_i
- Location: d_i = -Hb_i / Iw_i,1
- Slope: s_i = Hw_i * Iw_i,1
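Combining the definitions above (and assuming a tanh activation, consistent with the [-1, +1] range in the slide's plot), a hidden unit's contribution can be rewritten entirely in the new variables, which is the essence of the reformulation:

```latex
% Substituting Iw_{i,1} = s_i / Hw_i and Hb_i = -d_i \, s_i / Hw_i
% into the unit's contribution Hw_i \tanh(Iw_{i,1} x_1 + Hb_i):
\[
  Hw_i \tanh\left( Iw_{i,1}\, x_1 + Hb_i \right)
  = h_i \tanh\!\left( \frac{s_i}{h_i}\,\left( x_1 - d_i \right) \right),
  \qquad h_i \equiv Hw_i ,
\]
% so the curve is centred at x_1 = d_i (Location), has asymptotic
% amplitude h_i (Height), and slope s_i at its centre (Slope).
```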
Reformulated Neural Network
ReNN variables in 2-input problems

[Figure: a single sigmoidal unit receiving inputs x1 and x2 through weights Iw_i,1 and Iw_i,2; its contribution Hw_i * Ho_i forms a sigmoidal surface over the (x1, x2) plane, annotated with Height, Location, Angle, and Slope]

New variables for each sigmoidal unit (a sketch of plausible formulas follows):
- Height
- Location
- Angle
- Slope (equivalently, Directional Slope 1 and Directional Slope 2)
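The slide names these variables without giving formulas. The sketch below uses natural 2-input generalizations of the 1-input definitions; these exact formulas are an assumption, and the paper gives the authoritative definitions:

```python
import numpy as np

def renn_variables_2input(Iw, Hb, Hw):
    """Geometric (ReNN-style) variables of one hidden unit with two inputs.

    Iw : array of shape (2,) -- input weights Iw_i,1 and Iw_i,2
    Hb : float               -- hidden bias Hb_i
    Hw : float               -- hidden-to-output weight Hw_i

    ASSUMED generalizations of the slide's 1-input formulas; see
    Razavi and Tolson (2011) for the exact multi-input definitions.
    """
    norm = np.linalg.norm(Iw)               # steepness of the sigmoidal ridge
    height = Hw                             # amplitude of the unit's contribution
    location = -Hb / norm                   # signed distance of the ridge from the origin
    angle = np.degrees(np.arctan2(Iw[1], Iw[0]))  # orientation in the x1-x2 plane
    overall_slope = Hw * norm               # slope across the ridge
    directional_slopes = Hw * Iw            # slopes along the x1 and x2 axes
    return height, location, angle, overall_slope, directional_slopes

# Example: a unit whose weight vector points mostly along x2
h, d, a, s, ds = renn_variables_2input(np.array([0.5, 2.0]), Hb=-1.0, Hw=3.0)
```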
Details include:
- Generalized geometry and a revised neural network formulation with respect to the new variables
- Derivation of the partial derivatives of the network error function with respect to the new variables, for back-propagation training algorithms
Razavi, S., and Tolson, B. A. (2011). "A new formulation for feedforward neural networks." IEEE Transactions on Neural Networks, 22(10), 1588-1598. DOI: 10.1109/TNN.2011.2163169
Extending ReNN to n-input problems is non-trivial; for the details, see Razavi and Tolson (2011).
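As a sketch of how the back-propagation derivatives arise in the 1-input case (the chain rule applied to the weight-variable mapping above; the paper's derivation covers the general case and may differ in form):

```latex
% 1-input case: the new variables (h_i, d_i, s_i) relate to the MLP
% weights by  Hw_i = h_i,  Iw_{i,1} = s_i / h_i,  Hb_i = -d_i s_i / h_i.
% Chain rule for the error E with respect to the new variables:
\begin{align*}
\frac{\partial E}{\partial s_i} &=
  \frac{1}{h_i}\frac{\partial E}{\partial Iw_{i,1}}
  - \frac{d_i}{h_i}\frac{\partial E}{\partial Hb_i},\\
\frac{\partial E}{\partial d_i} &=
  -\frac{s_i}{h_i}\frac{\partial E}{\partial Hb_i},\\
\frac{\partial E}{\partial h_i} &=
  \frac{\partial E}{\partial Hw_i}
  - \frac{s_i}{h_i^{2}}\frac{\partial E}{\partial Iw_{i,1}}
  + \frac{d_i s_i}{h_i^{2}}\frac{\partial E}{\partial Hb_i}.
\end{align*}
```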
Example Applications …
ReNN Efficiency in Training - Case Study 1

Example Application in Metamodelling
Neural networks are frequently used to model (emulate) computationally expensive models (e.g., in optimization, model calibration, and real-time/operational settings), so network training efficiency is very important.

[Figure: mean squared error vs. training epoch, averaged over 50 trials, for networks trained with the standard back-propagation algorithm. Two panels: a test function (ANN 2-10-1 vs. ReNN 2-10-1) and the SWAT2000 hydrologic model of the Cannonsville Reservoir Watershed, NY (ANN 14-10-1 vs. ReNN 14-10-1). The "Saving" annotation marks the reduction in training epochs achieved by ReNN.]
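A minimal illustration of the metamodelling workflow described above, using scikit-learn's MLPRegressor as a stand-in for the ANN/ReNN and a hypothetical cheap test function (both are illustrative assumptions, not from the slides):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor  # stand-in for the ANN/ReNN

# Hypothetical "expensive" simulation we want to emulate (illustrative only).
def expensive_model(x):
    return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

# 1) Sample the expensive model a limited number of times.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.array([expensive_model(x) for x in X])

# 2) Train a neural-network metamodel on the samples
#    (a 2-10-1 architecture, as in the slide's first test case).
surrogate = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000).fit(X, y)

# 3) Use the cheap surrogate in place of the expensive model,
#    e.g., inside an optimization or calibration loop.
candidates = rng.uniform(-1, 1, size=(10000, 2))
best = candidates[np.argmin(surrogate.predict(candidates))]
```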
Interpretation of ReNN Variables – Case Study 2

Example Application in Rainfall-Runoff Modelling (monthly)

ReNN 6-5-1: inputs are Precipitation Gauge 1 (t), Precipitation Gauge 2 (t), Precipitation Gauge 3 (t), Precipitation Gauge 4 (t), Average Precipitation (t-1), and Average Temperature (t); the output is Runoff (t). Input-output data are scaled to [-1, +1].

[Figure: map of the Cannonsville Reservoir Watershed, New York (area = 1200 km2), showing Precipitation Gauges 1-4 and the Walton runoff gauge]
Interpretation of ReNN Variables – Case Study 2

ReNN variables of the trained 6-5-1 network:

Sigmoidal Unit    Height    Overall Slope    Location
1                  0.62         25.92          -0.05
2                 -0.04         -1.42           0.25
3                  0.81          8.70          -0.02
4                 -8.65         -6.60          -2.65
5                 -0.77        -17.55          -0.68

Output Bias: 8.56

Directional Slopes    PR1(t)   PR2(t)   PR3(t)   PR4(t)   PR(t-1)   TP(t)
Sigmoidal Unit 1       -8.30    -5.72    -6.70    -4.16    -22.14    4.25
Sigmoidal Unit 2        0.52     0.55     0.69     0.07     -0.74   -0.65
Sigmoidal Unit 3        3.43     1.77     2.62     1.53      6.76   -2.44
Sigmoidal Unit 4        1.39     2.61    -1.31    -0.80      5.46    1.65
Sigmoidal Unit 5        1.30     1.85     2.41     1.93    -14.95   -5.07
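The summary slide notes that ReNN's interpretable geometry is useful for sensitivity analysis. One plausible reading of the table above (an illustration, not necessarily the authors' procedure) is to aggregate the magnitudes of each input's directional slopes across units as a rough indicator of input influence:

```python
import numpy as np

# Directional slopes from the table above: rows = sigmoidal units 1-5,
# columns = PR1(t), PR2(t), PR3(t), PR4(t), PR(t-1), TP(t).
D = np.array([
    [-8.30, -5.72, -6.70, -4.16, -22.14,  4.25],
    [ 0.52,  0.55,  0.69,  0.07,  -0.74, -0.65],
    [ 3.43,  1.77,  2.62,  1.53,   6.76, -2.44],
    [ 1.39,  2.61, -1.31, -0.80,   5.46,  1.65],
    [ 1.30,  1.85,  2.41,  1.93, -14.95, -5.07],
])
inputs = ["PR1(t)", "PR2(t)", "PR3(t)", "PR4(t)", "PR(t-1)", "TP(t)"]

# Crude influence score: mean absolute directional slope per input.
influence = np.abs(D).mean(axis=0)
for name, score in sorted(zip(inputs, influence), key=lambda p: -p[1]):
    print(f"{name:8s} {score:6.2f}")
```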
[This slide repeats the tables above, highlighting the Height (-8.65) and Location (-2.65) of Sigmoidal Unit 4 and the Output Bias (8.56)]
ReNN Regularization Measure

Conventional measure of regularization: reg_conventional (based on the magnitudes of the network weights)
New measure of regularization: reg_new

Performance function with regularization:

Performance Function = w * mse + (1 - w) * reg_new

The new measure directly quantifies the smoothness of the network response. Of two networks with the same accuracy on the training data, the one with the smoother (more regularized) response is expected to have better generalizability.
This measure is applicable to both ReNN and traditional neural networks.
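A minimal sketch of the regularized performance function above. The slide does not give the formula for reg_new, so the mean squared slope of the sigmoidal units is assumed here purely for illustration (it matches the stated goal of penalizing steep, non-smooth responses; the authors define the actual measure in their work):

```python
import numpy as np

def performance(mse, slopes, w=0.9):
    """Regularized performance function from the slide:
        Performance = w * mse + (1 - w) * reg_new

    reg_new is ASSUMED here to be the mean squared slope of the
    sigmoidal units; the slide only states that it quantifies the
    smoothness of the network response.
    """
    reg_new = np.mean(np.square(slopes))
    return w * mse + (1 - w) * reg_new

# Example: two networks with identical training error; the one with
# gentler slopes (a smoother response) scores better.
print(performance(0.010, slopes=[25.9, -1.4, 8.7, -6.6, -17.6]))
print(performance(0.010, slopes=[2.1, -1.4, 3.0, -0.6, -1.7]))
```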
Summary
Reformulated Neural Network (ReNN) is an equivalent reformulation of multilayer perceptron (MLP) neural networks with the following benefits:
- ReNN is trained faster,
- ReNN has an interpretable internal geometry (e.g., useful for sensitivity analysis), and
- ReNN has a direct measure of regularization (smoothness).
ReNN can be converted back into a traditional neural network.

For full information on the ReNN formulation and derivations, please refer to:

Razavi, S., and Tolson, B. A. (2011). "A new formulation for feedforward neural networks." IEEE Transactions on Neural Networks, 22(10), 1588-1598. DOI: 10.1109/TNN.2011.2163169
Thank You