Noiseproof edge detector based on a neural net
DESCRIPTION
Student research project Phoenix-3. Alex Astapkovitch, Head of the Student Design Center, State University of Aerospace Instrumentation, Saint-Petersburg, Russia, 2010. (PowerPoint presentation transcript)
NOISEPROOF EDGE DETECTOR
BASED ON A NEURAL NET
Alex Astapkovitch, Head of the Student Design Center
State University of Aerospace Instrumentation, Saint-Petersburg, Russia
2010
Student research project Phoenix-3
STUDENT RESEARCH PROJECTS "Phoenix-X" HISTORY
2006 - 2007, 2007 - 2008: real robots PHOENIX-X
- distant immobilizer; video system;
2008 - 2009: SOFA-2009 virtual robot benchmark model
2010: neural net noiseproof edge detector
- Details on site guap.ru > english version > student design center
PHOENIX-X PHILOSOPHY
• A vision system is a necessary part of most modern robots;
• An edge detector is a part of a robot vision system;
LEGEND OF THE PHOENIX-3 PROJECT: the autonomous robot Phoenix-3 is designed to patrol a given area and detect centers of flame. On detecting a flame, the robot should approach it and use the onboard fire extinguisher to put it out. For orientation, a shock-proof video camera with a rotary mechanism and a zoom lens is to be used.
- The strategic goal of the "Phoenix-X" projects is to develop an understanding of supervised learning for control systems based on neural nets.
LAPLACE EDGE FILTER
Laplace 3x3 kernel:
 0 -1  0
-1  4 -1
 0 -1  0
CLEAN IMAGE "DOG AND BALL" (TEST_FIG, LF_FIG_1)
NOISY IMAGE, UNIFORM NOISE MODEL (NOISY_TEST)
FILTERED (LF_NTEST_FIG)
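As a sketch, the Laplace filtering step can be reproduced in a few lines (Python/numpy assumed; `convolve3x3` is an illustrative helper, not part of the project code):

```python
import numpy as np

# 3x3 Laplace kernel from the slide
LAPLACE = np.array([[0, -1, 0],
                    [-1, 4, -1],
                    [0, -1, 0]], dtype=float)

def convolve3x3(img, kernel):
    """Slide the 3x3 kernel over the image (valid region only)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * kernel)
    return out

# Flat regions give 0; the response is nonzero only at brightness steps.
img = np.full((6, 6), 50.0)
img[2:, 2:] = 250.0              # bright square, as in the heuristic sample below
edges = convolve3x3(img, LAPLACE)
```

On the flat background the response is 0; at the 50-to-250 step the corner response reaches 400, which is why a threshold is needed to separate edges from noise.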
CONCEPT OF THE NEURON FILTER
GLOSSARY
- "LEARNING" stands for the procedure that determines the values of the weight vector W and the thresholds of the activation function;
- "SUPERVISED LEARNING" stands for LEARNING with a set of "SAMPLES";
- A SAMPLE is a pair of frames: a RAW FRAME and a RESULTING FRAME;
[Diagram: inputs S1, S2, ..., SNsen plus a constant input 1 are weighted and summed, (S, W); the activation gives the output RESi,j. The FILTER scans the FRAME with a 3x3 window S1 S2 S3 / S4 S5 S6 / S7 S8 S9.]
Piecewise linear activation function F(x) with thresholds THmin and THmax
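A minimal sketch of such an activation, assuming a ramp between THmin and THmax (the slide names only the two thresholds, so the exact scaling here is an assumption):

```python
import numpy as np

def activation(x, th_min, th_max):
    """Piecewise-linear activation: 0 below th_min, linear ramp in
    between, saturated at 1 above th_max (one plausible reading of
    the slide's F(x) with thresholds THmin and THmax)."""
    x = np.asarray(x, dtype=float)
    return np.clip((x - th_min) / (th_max - th_min), 0.0, 1.0)
```

With th_min = 60 and th_max = 200, sub-threshold noise maps to 0 and strong edge responses saturate at 1.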
ONE STEP SUPERVISED LEARNING PROCEDURE
| S1(0,0)  S2(0,0)  ...  SNsen(0,0)  1 |   | w1      |   | F(0,0) |
| S1(0,1)  S2(0,1)  ...  SNsen(0,1)  1 | * | w2      | = | F(0,1) |
| ...                                  |   | ...     |   | ...    |
| S1(i,j)  S2(i,j)  ...  SNsen(i,j)  1 |   | wNsen+1 |   | F(i,j) |
Weight calculation is a one-step procedure:
S * W = F  -  an ill-posed problem for W
- S is a rectangular matrix formed from the RAW sample, F is the sample vector;
- W is the unknown neuron filter weight vector;
Tikhonov regularization provides a stable solution:
min over W:  F(W) = (SW - F, SW - F) + α(W, W)
W = (SᵀS + αE)⁻¹ SᵀF
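The one-step formula can be sketched directly with numpy (the parameter `alpha` stands in for the Tikhonov coefficient, whose value the slides do not give):

```python
import numpy as np

def one_step_learning(S, F, alpha=1e-3):
    """W = (S^T S + alpha*E)^(-1) S^T F, the Tikhonov-regularized
    least-squares solution of the ill-posed system S W = F."""
    n = S.shape[1]
    return np.linalg.solve(S.T @ S + alpha * np.eye(n), S.T @ F)

# toy check with synthetic data: recover known weights
S = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W_true = np.array([2.0, -1.0])
W = one_step_learning(S, S @ W_true, alpha=1e-9)
```

Using `np.linalg.solve` rather than an explicit inverse is the numerically preferred way to evaluate the same formula.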
MULTISAMPLE LEARNING PROCEDURE
ONE STEP WEIGHT CALCULATION PROCEDURE:  W = (SᵀS + αE)⁻¹ SᵀF
Sample_1: S1, F1
Sample_2: S2, F2
TWO SAMPLE CASE: stack the samples
S = | S1 |      F = | F1 |
    | S2 |          | F2 |
MULTISAMPLE LEARNING
Let us introduce:
Sek = Σ SkᵀSk  -  the experience matrix for k samples;
Fek = Σ SkᵀFk  -  the experience vector for k samples;
Wk+1 = (Sek + Sk+1ᵀSk+1 + αE)⁻¹ (Fek + Sk+1ᵀFk+1)
ONE SAMPLE LEARNING:  W1 = (S1ᵀS1 + αE)⁻¹ S1ᵀF1
TWO SAMPLE LEARNING:  W2 = (S1ᵀS1 + S2ᵀS2 + αE)⁻¹ (S1ᵀF1 + S2ᵀF2)
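The experience-matrix update can be sketched as an accumulator (illustrative; the class name `ExperienceLearner` is not from the talk):

```python
import numpy as np

class ExperienceLearner:
    """Accumulates Se_k = sum S_k^T S_k and Fe_k = sum S_k^T F_k,
    so adding a sample never requires revisiting the old ones."""
    def __init__(self, n_weights, alpha=1e-3):
        self.Se = np.zeros((n_weights, n_weights))
        self.Fe = np.zeros(n_weights)
        self.alpha = alpha

    def add_sample(self, S, F):
        self.Se += S.T @ S
        self.Fe += S.T @ F

    def weights(self):
        n = self.Se.shape[0]
        return np.linalg.solve(self.Se + self.alpha * np.eye(n), self.Fe)

# synthetic two-sample check
rng = np.random.default_rng(0)
S1, S2 = rng.normal(size=(10, 3)), rng.normal(size=(10, 3))
W_true = np.array([1.0, 2.0, 3.0])
learner = ExperienceLearner(3, alpha=1e-9)
learner.add_sample(S1, S1 @ W_true)   # one-sample learning: W1
learner.add_sample(S2, S2 @ W_true)   # two-sample learning: W2
```

After both calls, `learner.weights()` equals the batch two-sample solution, which is exactly the point of keeping the experience matrix and vector.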
LEARNING WITH HEURISTIC
SAMPLE_1 RAW FRAME (fragment):
50 50  50  50  50  50
50 50  50  50  50  50
50 50 250 250 250 250
50 50 250 250 250 250
50 50 250 250 250 250
50 50 250 250 250 250
The RESULTING FRAME is the LAPLACE FILTERING of the RAW FRAME
TEST OF THE LEARNING PROCEDURE
RAW FRAME: NOISY_REC1; RES FRAME: the LAPLACE-FILTERED NOISY FRAME
(Figures: CLEAN_REC1, LF_CLEAN_REC1, NOISY_REC1, LF_NOISY_REC1)
LN_WEIGHTS: learned 3x3 mask (weights W1_L0 ... W1_L8):
corners W1_L0 = W1_L2 = W1_L6 = W1_L8 = 1.269e-6; edges W1_L1 = W1_L3 = W1_L5 = W1_L7 = 1; center W1_L4 = 4
LN_WEIGHTS_CONST: W1_L9 = 2.392e-13
Resulting (Laplace-filtered) frame fragment:
0    0    0    0    0    0
0    0   -200 -200 -200 -200
0   -200  400  200  200  200
0   -200  200   0    0    0
0   -200  200   0    0    0
0   -200  200   0    0    0
- a 100*100 bmp artificially formed sample was used;
- a 10th sensor for the constant part was added;
NEURONLIKE LAPLACE EDGE FILTER
CLEAN AND NOISY BMP IMAGE "DOG AND BALL" (TEST_FIG, NOISY_TEST; LF_FIG_1)
FILTERING WITH PURE LAPLACE: THmin = 60 (LF_NTEST_FIG)
- UNIFORM NOISE MODEL;
- NOISE AMPLITUDE 60;
- DOG BODY AMPLITUDE 150;
NEURONLIKE LAPLACE EDGE FILTER WITH THRESHOLDS: THmin = -20 (LF_NFIG_1)
NEURON "LINEAR" FILTER LEARNING
WHAT LENGTH OF FILTER HAS TO BE USED?
Due to symmetry conditions:
- a 3*3 neuron filter has 3 free weights
- a 5*5 neuron filter has 3 + 3 = 6 free weights
- a 7*7 neuron filter has 6 + 4 = 10 free weights
WHAT SAMPLE SET HAS TO BE USED?
- HAND MADE BORDER WITH REGULATED THICKNESS (here it is 2);
- LAPLACE, SOBEL, CANNY FILTERED BORDERS;
(Figures: CLEAN_REC1, BORDER_REC1, NOISY_REC1; CLEAN_BAL1, BORDER_BAL1, NOISY_BAL1)
HOW MANY SAMPLES HAVE TO BE USED?
- Rectangles, circles ...?
- A mixed set of samples ...?
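The free-weight counts above follow from counting orbits of mask cells under full 8-fold symmetry; a small sketch (Python, illustrative):

```python
def free_weights(n):
    """Independent weights of an odd n*n mask under full 8-fold
    (dihedral) symmetry: the fundamental domain is a triangle of
    t*(t+1)/2 cells, with t = (n + 1) // 2."""
    t = (n + 1) // 2
    return t * (t + 1) // 2
```

For n = 3, 5, 7 this gives 3, 6 and 10, matching the slide.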
EXPERIMENTS WITH SUPERVISED LEARNING
SAMPLES SET (noisy frames: NOISY_REC1, NOISY_BAL1; Laplace borders: LF_CLEAN_REC1, LF_CLEAN_BAL1; hand made borders: BORDER_REC1, BORDER_BAL1)
- a 5*5 neuron is learned with different samples;
- the neuron filter is tested with the noisy "DOG AND BALL";
- the low threshold is selected by hand;
NUMERICAL EXPERIMENTS - I
[Weight tables H51 (one-sample learning, REC set) and H52 (two-sample learning, REC+BALL set): learned 5*5 masks of small mixed-magnitude weights; constant terms H51_CONST = 4.991, H52_CONST = 2.265]
(Figures: NHL5_FIG_1C, NHL5_FIG_2C, NHL5_FIG_1N, NHL5_FIG_2N)
- a HAND MADE BORDER WITH THICKNESS 2 was used;
- REC and REC+BALL learning sets were used;
- weight vector transformation to filter mask;
TEST FIGURE (CLEAN AND NOISY) FILTERED WITH THE NEURON (TRmin = 50)
CLEAN: H51, H52; NOISY: H51, H52
CONCLUSION: THE PERFORMANCE OF THE FILTER LEARNED WITH TWO SAMPLES IS BETTER.
NUMERICAL EXPERIMENTS - II
[Weight tables L51 (one-sample learning, REC set) and L52 (two-sample learning, REC+BALL set): learned 5*5 masks; constant terms L51_CONST = 0.091, L52_CONST = 0.135]
- LAPLACE FILTERED BORDERS were used;
- REC and REC+BALL learning sets were used;
- weight vector transformation to filter mask;
TEST FIGURE (CLEAN AND NOISY) FILTERED WITH THE NEURON (TRmin = 65)
(Figures: NLL5_FIG_1C, NLL5_FIG_2C, NLL5_FIG_1N, NLL5_FIG_2N)
CLEAN: L51, L52; NOISY: L51, L52
CONCLUSION: IT IS POSSIBLE TO "STEAL" A HEURISTIC ALGORITHM THROUGH LEARNING.
NUMERICAL EXPERIMENTS - III
[Weight tables H51 (one-sample learning) and H52 (two-sample learning) for the thickness-1 experiment: learned 5*5 masks; constant terms H51_CONST = 4.991, H52_CONST = 2.265]
- a HAND MADE BORDER WITH THICKNESS 1 was used;
TEST FIGURE (CLEAN AND NOISY) FILTERED WITH THE NEURON
(Panels: NHL5_FIG_2C, TRmin = 30; NHL5_FIG_1N, TRmin = 40; NHL5_FIG_2N, TRmin = 40)
CLEAN: H51; NOISY: H51, H52
CONCLUSION: A MEASURE OF THE QUALITY OF FILTERING HAS TO BE INTRODUCED.
Learning asymmetry effect
- The learning asymmetry effect was discovered with the SOFA-2009 model for the robot cruise neurocontroller;
- For the neuron edge detector the effect shows up as asymmetry between the values of weights at symmetric positions;
- A norm of the weight asymmetry can be introduced and used to estimate the quality of learning;
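One possible asymmetry norm (the talk does not fix a formula, so this RMS version comparing each weight with its 180-degree mirror is an assumption) can be sketched as:

```python
import numpy as np

def asymmetry_norm(W):
    """RMS deviation of a square filter mask from its 180-degree
    rotation; 0 for a perfectly symmetric mask (one possible norm,
    not necessarily the one used in the project)."""
    W = np.asarray(W, dtype=float)
    sym = 0.5 * (W + W[::-1, ::-1])   # symmetrized mask
    return float(np.sqrt(np.mean((W - sym) ** 2)))

# the Laplace mask is fully symmetric, so its asymmetry norm is 0
laplace = np.array([[0, -1, 0],
                    [-1, 4, -1],
                    [0, -1, 0]], dtype=float)
```

A learned mask with unequal weights at mirror positions gets a strictly positive norm, so the value can rank learning runs.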
ONE SAMPLE LEARNING vs TWO SAMPLE LEARNING
[Weight tables H51 (one-sample) and H52 (two-sample) repeated from Numerical Experiments III for comparison; constant terms H51_CONST = 4.991, H52_CONST = 2.265]
CONCLUSIONS: IT IS POSSIBLE TO CONTROL THE FILTER QUALITY WITH SOME ASYMMETRY NORM.
Modified One Step Learning Procedure - I
• One possible way to solve the asymmetry problem is to describe the relations between the weights in explicit form;
• For learning asymmetry there exist linear relations between the weights at symmetric positions in the filter matrix;
• A one-step learning procedure based on Lagrange multipliers makes it possible to take these linear relations into account and thus avoid asymmetry effects;
• This procedure was tested with the SOFA-2009 model;
Modified One Step Learning Procedure - II
• The learning asymmetry problem can be solved if one takes into account the linear relations between weights:
W(k,i) = W(m,n)
• In general the set of these relations can be presented as Nsym symmetry conditions, expressed in matrix form:
L W = b
• The Lagrange multiplier method for the one-step learning procedure can be formulated as a constrained optimization problem:
min over W, μ:  F(W) = (SW - F, SW - F) + α(W, W) + (μ, LW - b)
Modified One Step Learning Procedure - III
• Let μ be the vector formed from the Lagrange multipliers:
μ = [μ1, μ2, ... , μNsym]ᵀ
• Introduce the modified vectors and matrices:
Wμ = [W μ]ᵀ      Fμ = [F b]ᵀ
Sμ = | S   L |      Eμ = | E 0 |
     | Lᵀ  0 |           | 0 0 |
• The solution is the vector
Wμ = (SμᵀSμ + αEμ)⁻¹ SμᵀFμ
CONCLUSIONS: IT IS POSSIBLE TO PROVIDE SYMMETRY OF THE FILTER WITH THE LAGRANGE MULTIPLIERS METHOD AND THE SAME LEARNING PROCEDURE.
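The constrained one-step procedure can be sketched as the standard KKT linear system for equality-constrained least squares (a restatement of the Lagrange formulation; the data in the example is hypothetical):

```python
import numpy as np

def constrained_learning(S, F, L, b, alpha=1e-3):
    """Solve min ||S W - F||^2 + alpha ||W||^2 subject to L W = b
    via the Lagrange-multiplier (KKT) linear system."""
    n, m = S.shape[1], L.shape[0]
    A = np.block([[2 * (S.T @ S + alpha * np.eye(n)), L.T],
                  [L, np.zeros((m, m))]])
    rhs = np.concatenate([2 * S.T @ F, b])
    sol = np.linalg.solve(A, rhs)
    return sol[:n]            # weights; sol[n:] are the multipliers

# toy example: tie w0 = w1 while fitting the asymmetric target F = [1, 3]
S = np.eye(2)
F = np.array([1.0, 3.0])
L = np.array([[1.0, -1.0]])   # one symmetry condition: w0 - w1 = 0
b = np.array([0.0])
W = constrained_learning(S, F, L, b, alpha=0.0)
```

The unconstrained fit would be [1, 3]; the symmetry condition forces both weights to the compromise value 2, which is how the method removes learning asymmetry.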
STANDARD REAL IMAGE TEST (MASTER_PIC; M_TEST_52H_30, M_TEST_52H_60, M_TEST_52L_30, M_TEST_52L_60)
- NEURON FILTER 5*5 LEARNED WITH:
  - TWO LAPLACE BORDER SAMPLES;
  - TWO HAND MADE BORDERS WITH THICKNESS 2;
- LOW THRESHOLDS WERE 30 and 60;
LEARNING WITH 2 HAND MADE BORDERS vs LEARNING WITH LAPLACE FILTERED BORDERS
CONCLUSION: THE FILTER PERFORMANCE IS EXCELLENT.
CONCLUSIONS
• THE PROPOSED APPROACH IS EXTREMELY FLEXIBLE AND MAKES IT POSSIBLE TO GENERATE, THROUGH LEARNING, A VARIETY OF FILTERS: LINEAR, NONLINEAR, LOCAL, DISTRIBUTED AND SO ON;
• IT IS POSSIBLE TO USE A VARIETY OF EXISTING FILTERS AS THE LEARNING SAMPLE SET;
• SOME WORK REMAINS TO BE DONE;
Supporting publications
1. Astapkovitch A.M. Learning asymmetry effect for the neuron net control systems. Proc. International forum "Modern information society formation - problems, perspectives, innovation approaches", p. 7-13, SUAI, Saint-Petersburg, June 6-11, 2009.
2. Astapkovitch A.M. Virtual mobile robot SOFA-2009. Proc. International forum "Information and communication technologies and higher education - priorities of modern society development", p. 7-15, SUAI, Saint-Petersburg, 2009.
3. Astapkovitch A.M. One step learning procedure for neural net control system. Proc. International forum "Information systems. Problems, perspectives, innovation approaches", p. 3-9, SUAI, Saint-Petersburg, 2007.
4. http://guap.ru > english version > student design center > student projects > SOFA-2009 and publications
AND FAREWELLS
I LEFT SUAI AND NOW WORK FOR THE "LANIT-TERCOM" COMPANY, WHICH HAS ITS ROOTS IN THE MATH-MECH FACULTY OF SAINT-PETERSBURG STATE UNIVERSITY.
THANKS TO ALL THE PHOENIX-X STUDENT RESEARCH PROJECT TEAMS OF DIFFERENT YEARS. AND I HOPE YOU REMEMBER THE LEGEND OF THE PHOENIX BIRD.
Sincerely yours: [email protected]