TRANSCRIPT
The Generalised Mapping Regressor (GMR) neural network for inverse discontinuous problems
Student: Chuan LU
Promotor: Prof. Sabine Van Huffel
Daily Supervisor: Dr. Giansalvo Cirrincione
Mapping Approximation Problem
Feedforward neural networks are universal approximators of nonlinear continuous functions (many-to-one, one-to-one), but:
• they don't yield multiple solutions
• they don't yield infinite solutions
• they don't approximate mapping discontinuities
Inverse and Discontinuous Problems
• Mapping: multi-valued, complex structure. A feedforward network trained by least squares (sum-of-squares error function) fits the conditional average of the target data, which is a poor representation of a multi-valued mapping.
• Mapping with discontinuities.
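The conditional-average failure can be seen without training any network: the minimiser of the sum-of-squares error at a given input is the mean of the targets observed there. A minimal numpy sketch on hypothetical toy data (the inverse of u = t², which has two branches ±√u):

```python
import numpy as np

# Multi-valued inverse data: each input u = t^2 carries two valid targets, t and -t.
t = np.linspace(0.1, 1.0, 50)
u = np.concatenate([t**2, t**2])   # inputs (each value appears twice)
y = np.concatenate([t, -t])        # targets (the two branches)

# The least-squares-optimal prediction at each input is the conditional
# average of its targets: here (t + (-t)) / 2 = 0 everywhere.
pred = np.array([y[u == ui].mean() for ui in np.unique(u)])
print(np.allclose(pred, 0.0))      # the fit collapses both branches to 0
```

The "best" least-squares output lies between the branches and on neither of them, which is exactly the behaviour the slide criticises.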
mixture-of-experts (Jacobs and Jordan; Bishop's ME extension)
(figure: input → Network 1, Network 2, Network 3 → gating network → output)
It partitions the solution between several networks. It uses a separate network to determine the parameters of each kernel, with a further network to determine the coefficients.
• winner-take-all
• kernel blending
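The two combination modes above (winner-take-all vs. kernel blending) can be sketched in a few lines. The linear experts, random weights, and `me_forward` helper below are illustrative assumptions, not the trained architecture of Jacobs and Jordan:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Hypothetical linear experts and gating network with random weights,
# purely for illustration; a real ME trains them jointly.
n_experts, d_in, d_out = 3, 2, 1
W_exp = rng.normal(size=(n_experts, d_out, d_in))   # one linear map per expert
W_gate = rng.normal(size=(n_experts, d_in))         # gating network weights

def me_forward(x, winner_take_all=False):
    g = softmax(W_gate @ x)                  # mixing coefficients (sum to 1)
    if winner_take_all:
        g = (g == g.max()).astype(float)     # hard partition of the input space
    outs = np.stack([W @ x for W in W_exp])  # each expert's prediction
    return (g[:, None] * outs).sum(axis=0)   # gated combination

x = np.array([0.5, -1.0])
y_soft = me_forward(x)                          # kernel blending
y_hard = me_forward(x, winner_take_all=True)    # winner-take-all
```

Winner-take-all lets one expert own each input region (so branches stay distinct), while blending smoothly interpolates between expert outputs.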
Example #1
(figure: ME vs. MLP mapping results)
Example #2
(figure: ME vs. MLP mapping results)
Example #3
(figure: ME vs. MLP mapping results)
Example #4
(figure: ME vs. MLP mapping results)
Generalised Mapping Regressor (GMR)
(G. Cirrincione and M. Cirrincione, 1998)
Characteristics:
• approximates every kind of function or relation M(x, y), x ∈ ℝ^m, y ∈ ℝ^n
• input: a collection of components of x and y
• output: estimation of the remaining components
• outputs all solutions, mapping branches, equilevel hypersurfaces
GMR Basic Ideas
• Z (augmented) space, unsupervised learning
• clusters ↔ mapping branches
• coarse-to-fine learning: incremental, competitive, based on mapping recovery (curse of dimensionality)
• topological neuron linking: distance, direction
• linking tracking: branches, contours
• open architecture: function approximation, pattern recognition
GMR four phases
Training Set → Learning (pool of neurons) → Linking (links) → Object Merging (object 1, object 2, object 3 → objects merged) → Recalling (input → branch 1, branch 2)
EXIN Segmentation Neural Network (EXIN SNN)
(G. Cirrincione, 1998)
clustering with a vigilance threshold
(figure: input/weight space; an input x falling outside every vigilance region creates a new neuron, w4 = x4)
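The slide's picture (winner update inside the vigilance region, a new neuron w4 = x4 otherwise) corresponds to a generic incremental competitive scheme. The sketch below captures that spirit only; the threshold, learning rate, and update rule are assumptions, not the exact EXIN SNN equations:

```python
import numpy as np

def vigilance_cluster(data, rho, lr=0.1):
    """Incremental competitive clustering with a vigilance threshold rho:
    the winner moves toward the input; if even the nearest weight is
    farther than rho, a new neuron is created at the input itself
    (as w4 = x4 on the slide).  A generic sketch, not the EXIN SNN rule."""
    weights = [data[0].copy()]
    for x in data[1:]:
        d = [np.linalg.norm(x - w) for w in weights]
        i = int(np.argmin(d))
        if d[i] > rho:
            weights.append(x.copy())             # vigilance failed: new neuron
        else:
            weights[i] += lr * (x - weights[i])  # winner update
    return np.array(weights)

pts = np.array([[0.0, 0.0], [0.05, 0.0], [1.0, 1.0], [1.0, 0.95]])
W = vigilance_cluster(pts, rho=0.3)
print(len(W))  # 2 neurons, one per cluster
```

A large rho gives the coarse quantization of the next slide (few neurons, one per branch); shrinking rho refines it.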
GMR Learning
Z (augmented) space
• coarse quantization: EXIN SNN with a high vigilance ρz (say ρ1)
• neuron ↔ branch (object)
GMR Learning
Z (augmented) space
• production phase
• Voronoi sets → domain setting
GMR Learning
Z (augmented) space
• fine quantization: secondary EXIN SNNs with ρz = ρ2 < ρ1
• one secondary training set per object neuron (TS#1 … TS#5)
• other levels are possible
GMR Coarse-to-fine Learning (Example)
(figure: PLN level 1, ρ1 = 0.2, epoch1 = 3; 1st PLN: 13 neurons)
(figure: PLN levels 1-2, ρ1 = 0.2, epoch1 = 3; ρ2 = 0.1, epoch2 = 3; 1st PLN: 13, 2nd PLN: 24; legend: object neuron, fine VQ neurons, object neuron Voronoi set)
GMR Linking
Task 1: from the Voronoi set of each neuron i, set up the neuron radius ri (domain variable); the radius is asymmetric.
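A simple way to derive a radius from a Voronoi set is to take the largest distance from the neuron weight to the training points it owns. This is a symmetric-radius simplification (the slide mentions an asymmetric radius, which this sketch does not capture), and `neuron_radii` is a hypothetical helper name:

```python
import numpy as np

def neuron_radii(weights, data):
    """Assign each training point to its nearest neuron (its Voronoi set)
    and set that neuron's radius to the largest distance in the set.
    Symmetric-radius sketch of the slide's 'domain variable' r_i."""
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    owner = d.argmin(axis=1)          # Voronoi assignment of each point
    radii = np.zeros(len(weights))
    for i in range(len(weights)):
        mine = d[owner == i, i]       # distances within neuron i's Voronoi set
        if mine.size:
            radii[i] = mine.max()
    return radii

W = np.array([[0.0, 0.0], [1.0, 0.0]])
X = np.array([[0.1, 0.0], [0.2, 0.0], [0.9, 0.0]])
print(neuron_radii(W, X))  # [0.2, 0.1]
```

An asymmetric variant would keep one such extent per direction (e.g. per half-space around the weight) instead of a single scalar.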
GMR Linking
Task 2: for one TS presentation zi:
• find the linking candidates: k-nn, via a branch and bound search technique
• distance test and direction test → create a link or strengthen a link
(figure: weight space; distances d1 … d5 from zi to w1 … w5; linking direction)
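The slide names the steps (k-nn candidates, distance test, direction test, link creation/strengthening) without their formulas, so the sketch below fills them in with plausible stand-ins: the distance test checks overlapping radii, and the direction test checks that the candidate lies roughly in the direction of the sample from the winner. Both thresholds and the helper name `try_link` are assumptions:

```python
import numpy as np

def try_link(z, weights, radii, links, k=3, cos_min=0.0):
    """One TS presentation z: pick the k nearest neurons, then link the
    winner to every candidate passing an (assumed) distance test and an
    (assumed) direction test.  Links are counted so repeats strengthen."""
    d = np.linalg.norm(weights - z, axis=1)
    order = np.argsort(d)
    win, cands = order[0], order[1:k]          # winner + linking candidates
    for j in cands:
        # distance test: winner and candidate domains overlap
        dist_ok = np.linalg.norm(weights[win] - weights[j]) <= radii[win] + radii[j]
        # direction test: candidate roughly in the sample's direction from the winner
        u = z - weights[win]
        v = weights[j] - weights[win]
        cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        if dist_ok and cos >= cos_min:
            key = (int(min(win, j)), int(max(win, j)))
            links[key] = links.get(key, 0) + 1  # create or strengthen the link
    return links

W = np.array([[0.0, 0.0], [0.3, 0.0], [5.0, 5.0]])
links = try_link(np.array([0.1, 0.0]), W, radii=np.array([0.5, 0.5, 0.5]), links={})
print(links)  # {(0, 1): 1}: the far neuron fails the distance test
```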
Branch and Bound Accelerated Linking
• neuron tree constructed during the learning phase (multilevel EXIN SNN learning)
• methods in the linking-candidate step (k-nearest-neighbours computation):
  d-BnB: keep neurons with d < γ·d1 (γ: linking factor, predefined)
  k-BnB: k predefined
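The d-BnB idea can be sketched with a one-level tree of nodes (centres plus radii, standing in for the multilevel EXIN SNN tree): a node whose distance lower bound exceeds the current threshold is pruned whole. The tree layout and the exact bound are assumptions; γ = 2.5 matches the value used in the linking example later:

```python
import numpy as np

def dbnb_candidates(z, centers, members, radii, gamma=2.5):
    # Lower bound on the distance from z to any neuron under each node:
    # ||z - centre|| - node_radius (triangle inequality).
    lb = np.linalg.norm(centers - z, axis=1) - radii

    # Find d1, the distance to the overall nearest neuron, visiting nodes
    # in order of their bound and stopping once the bound exceeds d1.
    d1 = np.inf
    for node in np.argsort(lb):
        if lb[node] >= d1:
            break                  # no remaining node can hold a closer neuron
        for w in members[node]:
            d1 = min(d1, np.linalg.norm(w - z))

    # d-BnB: keep every neuron with d < gamma * d1, pruning whole nodes.
    thresh = gamma * d1
    cands = []
    for node in range(len(centers)):
        if lb[node] > thresh:
            continue               # bound test: skip the whole subtree
        cands += [w for w in members[node] if np.linalg.norm(w - z) < thresh]
    return d1, cands

# two nodes: one near the query, one far away (the far node is pruned)
centers = np.array([[0.0, 0.0], [10.0, 0.0]])
members = [np.array([[0.0, 0.0], [0.4, 0.0]]), np.array([[10.0, 0.0]])]
radii = np.array([0.5, 0.1])
d1, cands = dbnb_candidates(np.array([0.1, 0.0]), centers, members, radii)
print(d1, len(cands))
```

k-BnB works the same way, except the threshold is the distance of the current k-th best neighbour instead of γ·d1.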
GMR Linking: branch and bound experimental results
(bar chart: percent of linking flops saved by branch and bound, for 2-level and 3-level d-BnB and k-BnB, on three problems: 2-D (TS 2k): 8, 2-D (TS 4k): 24, 3-D (TS 3k): 199 linking flops (×100,000); savings range from 27% up to 83%)
branch and bound (cont.)
Branch and bound is also applied in the learning phase (labelling):
• tree construction: k-means, EXIN SNN
• experimental results (in the 3-D example): 50% of labelling flops are saved
GMR Linking Example
(figure: linking with γ = 2.5; links drawn between neurons)
GMR Merging Example
(figure: merging with threshold = 1; objects: 13 → 3)
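Merging objects whose neurons are linked is a connected-components computation; a union-find sketch with an assumed link-count threshold (the exact merging criterion is not given on the slide) looks like this:

```python
def merge_objects(n_objects, inter_links, threshold=1):
    """Merge objects connected by at least `threshold` links (union-find).
    Sketch of the merging step in which, on the slide, 13 objects collapse
    to 3 with threshold = 1.  `inter_links` maps object pairs to link counts."""
    parent = list(range(n_objects))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for (a, b), count in inter_links.items():
        if count >= threshold:
            parent[find(a)] = find(b)      # union the two objects
    return len({find(i) for i in range(n_objects)})

# toy example: 5 objects, link chains 0-1-2 and 3-4 -> 2 merged objects
merged = merge_objects(5, {(0, 1): 2, (1, 2): 1, (3, 4): 1})
print(merged)  # 2
```

Raising the threshold makes merging stricter, so weakly linked objects stay separate branches.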
GMR Recalling Example
y = f(x) = ¼ sin(2πx) + 0.01/(x² + 0.04)
(figure: recall for input x = 0.2; Level 1 neurons: 3; legend: level 1 neuron, level 2 neuron, branch 1, branch 2)
(figure: recall for input y = 0.6; Level 1 neurons: 1)
• level one neurons: input within their domain
• level two neurons: only connected ones
• level zero neurons: isolated (noise)
Experiments
spiral of Archimedes: ρ = aθ (a = 1)
Experiments
Sparse regions → further normalizing + higher mapping resolution
y = f(x) = ¼ sin(2πx) + 0.01/(x² + 0.04)
Experiments: noisy data
lemniscate of Bernoulli: (x² + y²)² = a²(x² − y²), a = 1
Experiments
Lissajous curve: x(t) = a cos(3t), y(t) = b sin(5t)
(figures: solutions for y = −0.5, Level 1 neurons: 6; y = −0.1, Level 1 neurons: 10; y = 0.5, Level 1 neurons: 5; y = 1, Level 1 neurons: 19)
Experiments
contours: links among level one neurons
GMR mapping of 8 spheres in a 3-D scene.
Conclusions
GMR is able to:
• solve inverse discontinuous problems
• approximate every kind of mapping
• yield all the solutions and the corresponding branches
GMR can be accelerated by applying tree-search techniques.
GMR still needs: interpolation techniques; kernels or projection techniques for high-dimensional data; adaptive parameters.
Thank you! (xiè-xiè)
GMR Recall
(figure: input presented in the weight space; neurons w1 … w8, each carrying a level label li and a branch label bi, all initialised to li = 0, bi = 0; winner radius r1)
• level one test: the winner holding the input in its domain becomes level one (l1 = 1, b1 = 1)
• linking tracking (restricted distance): a connected neuron goes from level zero to level two and takes the winner's branch (l3 = 2, b3 = 1)
GMR Recall
(figure: the next candidate is processed with radius r2; w2 passes the level one test, l2 = 1, and its branch label b2 = 2 becomes b2 = 1 when linking tracking detects the branch cross)
• level one test
• linking tracking
• branch cross
GMR Recall
(figure: the walk continues until completion of the candidates; l4 = 1, b4 = 4; l5 = 2, b5 = 4; l6 = 1, b6 = 6, then b6 = 4 after clipping)
• level one neurons: input within their domain
• level two neurons: only connected ones
• level zero neurons: isolated (noise)
Result: two branches.
GMR Recall
(figure: final labels; l1 = 1, b1 = 1; l2 = 1, b2 = 1; l3 = 2, b3 = 1; l4 = 1, b4 = 4; l6 = 1, b6 = 4)
• Output = weight complements of the level one neurons
• Output interpolation
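The recall walk above can be condensed into a small sketch under simplifying assumptions: scalar x and y parts, symmetric radii, branches as connected components of the active (level 1 and 2) neurons, and one representative output per branch instead of the interpolation step. `gmr_recall` is a hypothetical helper name:

```python
import numpy as np

def gmr_recall(x_in, weights_in, weights_out, radii, links):
    """Simplified recall: neurons whose x-weight contains the input within
    their radius become level one; neurons linked to an active neuron become
    level two; the rest stay level zero (noise).  Branches are connected
    components; each level-one neuron's output is its weight complement."""
    n = len(weights_in)
    level = np.zeros(n, dtype=int)
    level[np.abs(weights_in - x_in) <= radii] = 1        # level one test
    for a, b in links:                                   # linking tracking
        if level[a] == 1 and level[b] == 0:
            level[b] = 2
        if level[b] == 1 and level[a] == 0:
            level[a] = 2
    branch, nxt = [0] * n, 0                             # branch labelling
    for i in range(n):
        if level[i] and not branch[i]:
            nxt += 1
            stack = [i]
            while stack:
                j = stack.pop()
                branch[j] = nxt
                stack += [b for a, b in links if a == j and level[b] and not branch[b]]
                stack += [a for a, b in links if b == j and level[a] and not branch[a]]
    # one representative y per branch (the real GMR keeps all level-one
    # neurons and interpolates; this toy version keeps the last found)
    return {branch[i]: weights_out[i] for i in range(n) if level[i] == 1}

w_x = np.array([0.2, 0.25, 0.8, 0.21])      # x-parts of the neuron weights
w_y = np.array([0.5, 0.55, -0.3, -0.6])     # weight complements (y-parts)
sols = gmr_recall(0.2, w_x, w_y, radii=np.full(4, 0.06), links=[(0, 1)])
print(sols)  # two branches for the same input x = 0.2
```

The dictionary holds one solution per branch, mirroring the slides' outcome of two branches for a single input.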