Ch. 4: Radial Basis Functions Stephen Marsland, Machine Learning: An Algorithmic Perspective. CRC 2009 based on slides from many Internet sources Longin Jan Latecki Temple University [email protected]


Page 1:

Ch. 4: Radial Basis Functions
Stephen Marsland, Machine Learning: An Algorithmic Perspective. CRC 2009
based on slides from many Internet sources
Longin Jan Latecki, Temple University
[email protected]

Page 2:

Perceptron

Page 3:

In RBFN

Page 4:

Architecture

[Figure: RBF network architecture — input layer (x1 … xn), hidden layer (h1 … hm), output layer f(x); the hidden-unit outputs are combined through weights w1 … wm.]

Page 5:

Three Layers

• Input layer
  – Source nodes that connect the network to its environment
• Hidden layer
  – Hidden units provide a set of basis functions
  – High dimensionality
• Output layer
  – Linear combination of the hidden functions

Page 6:

Radial Basis Function

h_j(x) = exp( -||x - c_j||² / r_j² )

f(x) = Σ_{j=1}^{m} w_j h_j(x)

where c_j is the center of a region and r_j is the width of the receptive field.
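The two formulas above can be sketched directly in numpy. This is a minimal toy example of my own (the centers, widths, and weights are arbitrary illustrative values, not from the slides):

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """f(x) = sum_j w_j * exp(-||x - c_j||^2 / r_j^2)."""
    d2 = np.sum((centers - x) ** 2, axis=1)  # squared distance to each center c_j
    h = np.exp(-d2 / widths ** 2)            # hidden-layer activations h_j(x)
    return weights @ h                       # output layer: linear combination

# Hypothetical toy network: two hidden units on a 1-D input
centers = np.array([[0.0], [1.0]])
widths = np.array([0.5, 0.5])
weights = np.array([1.0, -1.0])
y = rbf_forward(np.array([0.0]), centers, widths, weights)
print(float(y))
```

At x = 0 the first unit is fully active (h_1 = 1) while the second is nearly off, so the output is close to w_1.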

Page 7:

Function Approximation with Radial Basis Functions

RBF Networks approximate functions using (radial) basis functions as the building blocks.

Page 8:

Exact Interpolation

• RBFs have their origins in techniques for performing exact function interpolation [Bishop, 1995]:
  – Find a function h(x) such that h(x_n) = t_n, n = 1, …, N
• Radial Basis Function approach (Powell, 1987):
  – Use a set of N basis functions of the form φ(||x - x_n||), one for each data point, where φ(·) is some nonlinear function.
  – Output: h(x) = Σ_n w_n φ(||x - x_n||)

Page 9:

Exact Interpolation

• Goal (exact interpolation): find a function h(x) such that h(x_n) = t_n, n = 1, …, N
• Radial Basis Function approach (Powell, 1987): use one basis function per data point, h(x) = Σ_n w_n φ(||x - x_n||)
• Writing the interpolation condition at every training point gives an N × N linear system Φw = t:

w_1 φ(||x_1 - x_1||) + w_2 φ(||x_1 - x_2||) + … + w_N φ(||x_1 - x_N||) = t_1
w_1 φ(||x_2 - x_1||) + w_2 φ(||x_2 - x_2||) + … + w_N φ(||x_2 - x_N||) = t_2
…
w_1 φ(||x_N - x_1||) + w_2 φ(||x_N - x_2||) + … + w_N φ(||x_N - x_N||) = t_N
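Solving this system is a few lines of numpy. The sketch below uses a Gaussian φ and a sine target; the data, the common width r, and the variable names are my own illustrative choices:

```python
import numpy as np

X = np.array([0.0, 1.0, 2.0, 3.0])  # training inputs x_n
t = np.sin(X)                        # targets t_n
r = 1.0                              # common width r (an assumption)

# Phi[n, m] = phi(||x_n - x_m||) with a Gaussian phi
Phi = np.exp(-(X[:, None] - X[None, :]) ** 2 / r ** 2)

# Solve the N x N system Phi w = t for the weights
w = np.linalg.solve(Phi, t)

# The interpolant reproduces every target exactly at the training points
h = Phi @ w
print(np.allclose(h, t))
```

A Gaussian kernel matrix built from distinct points is positive definite, so `np.linalg.solve` succeeds without regularization here.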

Page 10:

Exact Interpolation

Page 11:

Exact Interpolation

Page 12:

• Due to noise that may be present in the data, exact interpolation is rarely useful.
• By introducing a number of modifications, we arrive at RBF networks. The complexity of the function, rather than the size of the data, is what is important:
  – The number of basis functions need not be equal to N.
  – Centers need not be constrained to the input data points.
  – Each basis function can have its own adjustable width parameter.
  – A bias parameter may be included in the linear sum.

Page 13:

Illustrative Example - XOR Problem

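The XOR illustration can be sketched in numpy. In this toy example of my own (the centers and threshold are the standard textbook choices, but the exact values here are illustrative), two Gaussians centered on (0,0) and (1,1) map the four XOR points into a feature space where a single threshold separates the classes:

```python
import numpy as np

# XOR is not linearly separable in the input space
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 1, 1, 0])

# Two Gaussian basis functions (width r = 1) centered on the class-0 points
centers = np.array([[0.0, 0.0], [1.0, 1.0]])

def phi(X, c):
    return np.exp(-np.sum((X - c) ** 2, axis=1))

H = np.column_stack([phi(X, c) for c in centers])  # hidden-layer features

# In (phi1, phi2) space, the sum phi1 + phi2 separates the classes:
# class 0 gives 1 + e^-2 ~ 1.14, class 1 gives 2e^-1 ~ 0.74
pred = (H.sum(axis=1) < 0.95).astype(int)
print(pred)
```

Both class-1 points land at the same feature coordinates (e^-1, e^-1), so one linear threshold in feature space classifies all four points correctly.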

Page 14:

Function Approximation via Basis Functions and RBF Networks

• Using nonlinear functions, we can convert a nonlinearly separable problem into a linearly separable one.
• From a function approximation perspective, this is equivalent to implementing a complex function (corresponding to the nonlinearly separable decision boundary) using simple functions (corresponding to linearly separable decision boundaries).
• Implementing this procedure with a network architecture yields RBF networks, if the nonlinear mapping functions are radial basis functions.
• Radial Basis Functions:
  – Radial: symmetric around its center
  – Basis functions: also called kernels; a set of functions whose linear combinations can generate an arbitrary function in a given function space.

Page 15:

RBF Networks

Page 16:

RBF Networks

Page 17:

Network Parameters

• What do these parameters represent?
• φ: the radial basis function for the hidden layer. This is a simple nonlinear mapping function (typically Gaussian) that transforms the d-dimensional input patterns into a (typically higher) H-dimensional space. The complex decision boundary is constructed from linear combinations (weighted sums) of these simple building blocks.
• u_ji: the weights joining the input layer to the hidden layer. These weights constitute the center points of the radial basis functions; they are also called prototypes of the data.
• σ: the spread constant(s). These values determine the spread (extent) of each radial basis function.
• w_jk: the weights joining the hidden and output layers. These are the weights used in the linear combination of the radial basis functions; they determine the relative amplitudes of the RBFs when they are combined to form the complex function.
• ||x - u_j||: the Euclidean distance between the input x and the prototype vector u_j. The activation of a hidden unit is determined by this distance through φ.

Page 18:

Page 19:

Training RBF Networks

Approach 1: Exact RBF
Approach 2: Fixed centers selected at random
Approach 3: Centers obtained from clustering
Approach 4: Fully supervised training

Page 20:

Training RBF Networks

• Approach 1: Exact RBF
  – Guarantees correct classification of all training data instances.
  – Requires N hidden-layer nodes, one for each training instance.
  – No iterative training is involved: the weights w are obtained by solving a set of linear equations.
  – Non-smooth; bad generalization.

Page 21:

Exact Interpolation

Page 22:

Exact Interpolation

Page 23:

Too Many Receptive Fields?

• In order to reduce the artificial complexity of the RBF, we need to use a smaller number of receptive fields.
• Approach 2: Fixed centers selected at random.
  – Use M < N data points as the receptive field centers.
  – Fast, but may require excessive centers.
• Approach 3: Centers obtained from unsupervised learning (clustering).
  – Centers no longer have to coincide with data points.
  – This is the most commonly used procedure and provides good results.
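Approach 3 can be sketched with a plain k-means loop followed by a least-squares fit of the output weights. This is my own minimal illustration (the sine target, M = 8, the width r = 1, and 20 k-means iterations are all arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 2 * np.pi, size=(200, 1))  # N = 200 training inputs
t = np.sin(X[:, 0])                           # targets

M, r = 8, 1.0                                 # M < N receptive fields

# --- k-means: move centers to the mean of their assigned points ---
centers = X[rng.choice(len(X), M, replace=False)]  # initialize from the data
for _ in range(20):
    d2 = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    assign = d2.argmin(axis=1)                # nearest center per point
    for j in range(M):
        if np.any(assign == j):
            centers[j] = X[assign == j].mean(axis=0)

# --- least-squares fit of the output weights on the M hidden features ---
H = np.exp(-np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2) / r ** 2)
w, *_ = np.linalg.lstsq(H, t, rcond=None)
mse = np.mean((H @ w - t) ** 2)
print(mse)
```

Because M < N, the system Hw = t is overdetermined, so the weights come from least squares rather than an exact solve, which is what gives the network its smoothing behavior.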

Page 24:

[Figure: comparison of fits obtained with Approach 2, Approach 3, and Approach 3.b]

Page 25:

Determining the Output Weights through learning (LMS)
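Since the hidden features are fixed once the centers and widths are chosen, the output error is linear in the weights and a per-sample LMS (delta-rule) update w ← w + η (t_n − w·h_n) h_n suffices. A minimal sketch of my own, with synthetic hidden activations standing in for a real hidden layer (the dimensions, η, and epoch count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
H = rng.normal(size=(100, 5))            # precomputed hidden activations h_n
w_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
t = H @ w_true                           # noiseless targets for the demo

w = np.zeros(5)                          # output weights to be learned
eta = 0.01                               # learning rate
for epoch in range(200):
    for h_n, t_n in zip(H, t):
        w += eta * (t_n - w @ h_n) * h_n # LMS update on one sample

print(np.max(np.abs(w - w_true)))        # approaches zero
```

With noiseless targets and a small enough η, the weights converge to the least-squares solution, here the generating vector itself.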

Page 26:

RBFs for Classification

Page 27:

Homework

• Problem 4.1, p. 117
• Problem 4.2, p. 117