Forward & Backward selection in hybrid network


Page 1: Forward & Backward selection in hybrid network


Forward & Backward selection in hybrid network

Page 2: Forward & Backward selection in hybrid network


Introduction

A training algorithm for a hybrid neural network for regression.

The hybrid neural network has a hidden layer containing RBF or projection units (perceptrons).
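
As an illustration only (not the authors' code), here is a minimal sketch of how such a hybrid hidden layer can produce a regression output, assuming Gaussian RBF units, sigmoidal projection units and a linear output layer:

import numpy as np

def hybrid_forward(x, rbf_centers, rbf_widths, proj_weights, proj_biases,
                   out_weights, out_bias):
    """Regression output of a network whose hidden layer mixes RBF and projection units."""
    # RBF units: Gaussian response around each center (assumed form)
    rbf_h = np.exp(-np.sum((x - rbf_centers) ** 2, axis=1) / (2.0 * rbf_widths ** 2))
    # Projection units: sigmoid of a linear projection, i.e. perceptron-style (assumed form)
    proj_h = 1.0 / (1.0 + np.exp(-(proj_weights @ x + proj_biases)))
    # The output layer combines both unit types linearly
    h = np.concatenate([rbf_h, proj_h])
    return out_weights @ h + out_bias

Here x is a single input vector; the centers, widths, projection weights, biases and output weights are the trainable parameters.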

Page 3: Forward & Backward selection in hybrid network


When is it good?

Page 4: Forward & Backward selection in hybrid network


Hidden Units

RBF:

MLP:
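
Written out in a common form (assuming Gaussian RBF units and sigmoidal projection units), the two hidden-unit types compute

\[
h^{\mathrm{RBF}}_j(\mathbf{x}) = \exp\!\left(-\frac{\lVert \mathbf{x}-\boldsymbol{\mu}_j\rVert^2}{2\sigma_j^2}\right),
\qquad
h^{\mathrm{MLP}}_j(\mathbf{x}) = \sigma\!\left(\mathbf{w}_j^{\top}\mathbf{x}+b_j\right),
\quad
\sigma(z)=\frac{1}{1+e^{-z}} .
\]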

Page 5: Forward & Backward selection in hybrid network


Overall algorithm

Divide the input space and assign units to each sub-region.

Optimize the parameters.

Prune unnecessary weights using the Bayesian Information Criterion (BIC).
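
As a self-contained illustration of this control flow (not the authors' implementation), the sketch below grows a network of Gaussian RBF units one at a time and then prunes it with BIC; the algorithm on the slides additionally splits the input space CART-style and chooses between RBF and projection units for each sub-region, as detailed on the following slides:

import numpy as np

def rbf_design(X, centers, width):
    """Design matrix of Gaussian RBF activations plus a bias column."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.hstack([np.exp(-d2 / (2.0 * width ** 2)), np.ones((X.shape[0], 1))])

def fit_output_weights(Phi, y):
    """Least-squares fit of the linear output layer."""
    return np.linalg.lstsq(Phi, y, rcond=None)[0]

def bic_score(y, y_hat, n_params):
    """Schwarz BIC under a Gaussian noise model (lower is better)."""
    n = len(y)
    return n * np.log(np.mean((y - y_hat) ** 2) + 1e-12) + n_params * np.log(n)

def train_sketch(X, y, error_goal=1e-2, max_units=20, width=1.0):
    """Forward leg: add a unit at the worst-fit point until the error goal or unit cap.
    Backward leg: greedily drop units whose removal lowers the BIC."""
    centers, residual = np.empty((0, X.shape[1])), y.copy()
    while np.mean(residual ** 2) > error_goal and len(centers) < max_units:
        centers = np.vstack([centers, X[np.argmax(np.abs(residual))]])
        Phi = rbf_design(X, centers, width)
        residual = y - Phi @ fit_output_weights(Phi, y)
    improved = True
    while improved and len(centers) > 1:
        improved = False
        Phi = rbf_design(X, centers, width)
        best = bic_score(y, Phi @ fit_output_weights(Phi, y), len(centers) + 1)
        for i in range(len(centers)):
            trial = np.delete(centers, i, axis=0)
            Phi_t = rbf_design(X, trial, width)
            if bic_score(y, Phi_t @ fit_output_weights(Phi_t, y), len(trial) + 1) < best:
                centers, improved = trial, True
                break
    return centers

The backward leg mirrors the pruning step on the slides: a component is removed whenever doing so improves the criterion.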

Page 6: Forward & Backward selection in hybrid network


Forward leg

Divide the input space into sub-regions.

Select the type of hidden unit for each sub-region.

Stop when the error goal is reached or the maximum number of units is used.

Page 7: Forward & Backward selection in hybrid network


Input space division

Like CART, choosing the split with the maximum reduction in the error criterion.
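
A minimal sketch of such a split search, assuming the reduction is measured in the sum of squared errors (the usual CART regression criterion):

import numpy as np

def best_split(X, y):
    """Return (feature, threshold, reduction) of the split that maximally reduces the SSE."""
    sse_parent = np.sum((y - y.mean()) ** 2)
    best = (None, None, 0.0)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:        # candidate thresholds between observed values
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            sse = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
            if sse_parent - sse > best[2]:
                best = (j, t, sse_parent - sse)
    return best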

Page 8: Forward & Backward selection in hybrid network


Unit type selection (RBF)

Page 9: Forward & Backward selection in hybrid network


Unit type selection (projection)

Page 10: Forward & Backward selection in hybrid network


Units parameters

RBF unit: center placed at the maximum point.

Projection unit: weight set to the normalized maximum point.
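
A hedged sketch of this initialization, assuming "maximum point" refers to the training point of the sub-region with the largest absolute residual (an assumption, not stated on the slide):

import numpy as np

def init_unit_params(X_region, residual_region):
    """Initialize a candidate unit from the sub-region's 'maximum point' (assumed here to
    be the point with the largest absolute residual)."""
    p = X_region[np.argmax(np.abs(residual_region))]
    rbf_center = p                                    # RBF unit: center at the maximum point
    proj_weight = p / (np.linalg.norm(p) + 1e-12)     # projection unit: normalized maximum point
    return rbf_center, proj_weight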

Page 11: Forward & Backward selection in hybrid network


ML estimate for unit type

Page 12: Forward & Backward selection in hybrid network


Pruning

Target function values are assumed to be corrupted with Gaussian noise.
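
Under this assumption, with noise precision β (standard notation, not taken from the slide), the likelihood of the targets is

\[
t_n = f(\mathbf{x}_n;\mathbf{w}) + \varepsilon_n,\quad
\varepsilon_n \sim \mathcal{N}\!\left(0,\beta^{-1}\right),
\qquad
p(D\mid\mathbf{w},\beta)=\prod_{n=1}^{N}\mathcal{N}\!\left(t_n \mid f(\mathbf{x}_n;\mathbf{w}),\,\beta^{-1}\right).
\]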

Page 13: Forward & Backward selection in hybrid network


BIC approximation

Schwarz, Kass and Raftery
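
The approximation in question is Schwarz's large-sample approximation to the log evidence of a model M with k free parameters fitted to N samples:

\[
\ln p(D\mid M)\;\approx\;\ln p\!\left(D\mid\hat{\mathbf{w}}_{\mathrm{ML}},M\right)-\frac{k}{2}\ln N .
\]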

Page 14: Forward & Backward selection in hybrid network


Evidence for the model
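
The evidence (marginal likelihood) of a model M is the likelihood integrated over the prior on the weights:

\[
p(D\mid M)=\int p(D\mid\mathbf{w},M)\,p(\mathbf{w}\mid M)\,d\mathbf{w} .
\]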

Page 15: Forward & Backward selection in hybrid network


Evidence for unit type

Page 16: Forward & Backward selection in hybrid network


Evidence for unit type (cont.)

Page 17: Forward & Backward selection in hybrid network


Evidence for unit type (cont.)

Page 18: Forward & Backward selection in hybrid network


Evidence for unit type: algorithm

Initialize α and β.

Loop: compute w and w0, then recompute α and β.

Until the difference in the evidence is low.
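
A sketch of this loop for a linear-in-weights model (MacKay-style evidence updates), where the design matrix Phi holds the hidden-unit outputs plus a bias column so that the solve yields w and w0 together; alpha is the weight-prior precision and beta the noise precision:

import numpy as np

def evidence_updates(Phi, t, alpha=1.0, beta=1.0, tol=1e-6, max_iter=100):
    """Iteratively re-estimate alpha and beta until the log evidence stops changing."""
    N, M = Phi.shape
    PtP = Phi.T @ Phi
    eig = np.linalg.eigvalsh(PtP)                       # eigenvalues of Phi^T Phi
    prev = -np.inf
    for _ in range(max_iter):
        A = alpha * np.eye(M) + beta * PtP              # posterior precision of the weights
        m = beta * np.linalg.solve(A, Phi.T @ t)        # posterior mean: (w, w0)
        err = np.sum((t - Phi @ m) ** 2)
        E = 0.5 * beta * err + 0.5 * alpha * (m @ m)
        evidence = (0.5 * M * np.log(alpha) + 0.5 * N * np.log(beta)
                    - E - 0.5 * np.linalg.slogdet(A)[1] - 0.5 * N * np.log(2 * np.pi))
        if abs(evidence - prev) < tol:                  # stop when the evidence changes little
            break
        prev = evidence
        gamma = np.sum(beta * eig / (alpha + beta * eig))   # effective number of parameters
        alpha = gamma / (m @ m)                         # re-estimate alpha
        beta = (N - gamma) / err                        # re-estimate beta
    return m, alpha, beta, evidence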

Page 19: Forward & Backward selection in hybrid network


Pumadyn data set

From the DELVE archive: dynamics of a Puma robot arm.

Target: angular acceleration of one of the links.

Inputs: various joint angles, velocities and torques.

Large Gaussian noise; nonlinear data set.

Input dimension: 8 or 32.

Page 20: Forward & Backward selection in hybrid network


Results pumadyn-32nh

Page 21: Forward & Backward selection in hybrid network


Results pumadyn-8nh

Page 22: Forward & Backward selection in hybrid network


Related work

Hassibi et al.: Optimal Brain Surgeon.

MacKay: Bayesian inference of weights and regularization parameters.

HME (Jordan and Jacobs): division of the input space.

Schwarz, Kass & Raftery: BIC.

Page 23: Forward & Backward selection in hybrid network


Discussion

Pruning removes 90% of the parameters.

Pruning reduces the variance of the estimator.

The pruning algorithm is slow.

PRBFN performs better than an MLP or an RBF network alone.

Disadvantage of the Bayesian techniques: the prior distribution parameter.

The Bayesian techniques perform better than the likelihood ratio test (LRT).

Unit type selection is a crucial element of PRBFN.

The curse of dimensionality is clearly visible on the pumadyn data sets.