
  • 7/29/2019 4540SampleReport.doc


    ASSIGNMENT 5

    CONTINUOUS ONLINE IDENTIFICATION

    WHEN A STEP CHANGE OCCURS IN THE

    FUNCTION TO BE IDENTIFIED

    SCOTT LARISCH

    ECE 8803A

    G E O R G I A I N S T I T U T E O F T E C H N O L O G Y


    Assignment 5, ECE 8803A

    INTRODUCTION

    This assignment extends Assignment 4, but now we introduce an abrupt change to the function to be identified after 50 seconds of training.

    The structure of the neural network is shown in Figure 1. The Adaptive Neural Network (ANN) identifier uses a Nonlinear Auto Regressive Moving Average (NARMA) model. Instead of a single input, x, along with a bias of 1, samples of x and previous estimates, along with a bias of 1, are now fed to each of the neurons in the hidden layer. Each neuron applies its weights, W, and then evaluates its sigmoidal function. The weights V are applied to the sigmoidal function outputs, and the products are summed to form the neural network estimate, ŷ.

    Figure 1. Block diagram of the continuous online identifier to be simulated.

    As in Assignment 4, the input x is a time varying function given by:

    x(t) = sin(2t)

    Initially the function of our plant is:

    y(t) = 2x^2 + 1

    After 50 seconds of training the function will be abruptly changed to:

    y(t) = 2.5(x - 0.2)^2 + 1.5
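The input signal and the switching plant described above can be sketched in Python (a minimal illustration of the two equations; the function names are my own, not from the report):

```python
import math

def x_input(t):
    """Input signal x(t) = sin(2t), as given in the text."""
    return math.sin(2.0 * t)

def plant(x, t):
    """Plant output: y = 2x^2 + 1 until t = 50 s,
    then abruptly y = 2.5(x - 0.2)^2 + 1.5."""
    if t < 50.0:
        return 2.0 * x**2 + 1.0
    return 2.5 * (x - 0.2)**2 + 1.5
```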


    We will use on-line training, and so we have to decide on a sampling frequency. Simulation runs will start at t = 0 and sample every 10 milliseconds; in other words, the sampling frequency will be 100 Hz.

    We will investigate the effect on system performance of changing the order of the ANN model. Initially, simulation runs will be made for a zeroth order model, i.e., the inputs to the ANN will be x(t) and a bias term of 1. Note that the zeroth order model was the model investigated in the first three assignments. Then we will change the ANN to a first order system. Here the inputs will be x(t), x(t-1), y(t-1), and a bias term of 1. Table 1 describes the orders of the model.

    System Order   X Inputs                                        Y Inputs
    0              x(t)                                            (none)
    1              x(t), x(t-1)                                    y(t-1)
    2              x(t), x(t-1), x(t-2)                            y(t-1), y(t-2)
    3              x(t), x(t-1), x(t-2), x(t-3)                    y(t-1), y(t-2), y(t-3)
    4              x(t), x(t-1), x(t-2), x(t-3), x(t-4)            y(t-1), y(t-2), y(t-3), y(t-4)
    5              x(t), x(t-1), x(t-2), x(t-3), x(t-4), x(t-5)    y(t-1), y(t-2), y(t-3), y(t-4), y(t-5)

    Table 1. Model order descriptions for Assignment 5.
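The input pattern for any model order in Table 1 can be assembled as follows (a sketch; the helper name is hypothetical, and the history lists are assumed to be ordered newest-first):

```python
def regressor(x_hist, y_hist, order):
    """Build the ANN input vector for a given identifier order:
    a bias of 1, then x(t)..x(t-order), then y(t-1)..y(t-order).
    x_hist = [x(t), x(t-1), ...]; y_hist = [y(t-1), y(t-2), ...]."""
    return [1.0] + x_hist[:order + 1] + y_hist[:order]
```

For order 0 this reduces to [1, x(t)], matching the first three assignments; for order 5 the vector has 12 elements, which is why the MATLAB listing sizes W with 12 columns.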

    As in the previous assignments, we will use the delta weight update strategy with various learning gains, kl, momentum gains, km, and numbers of hidden layer neurons. For the delta weight update strategy, first the new weights, Vnew, are calculated as:

    dVnew = kl * e * DT + km * dVold

    Vnew = Vold + dVnew

    Then the new weights, Wnew, are calculated as:

    dWnew = kl * e * Vnew * d * (1-d) * X + km * dWold

    Wnew = Wold + dWnew


    The output tracking error is calculated as:

    e = y - VT * d

    We will plot the tracking error, e, versus the number of epochs. The mean square error (MSE) will be calculated for each combination of number of neurons in the hidden layer, learning gain, momentum gain, and identifier order. Note that, since there is only one output for this system, the mean square error is calculated by summing the square of the tracking error at each epoch and then dividing by the total number of epochs:

    MSE = (1/epochs) * sum( (y - VT * d)^2 )
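Putting the forward pass, the tracking error, and the delta weight updates together, one online training step might look like this in Python/NumPy (a sketch of the equations above, not the report's actual MATLAB; the array shapes are my assumptions):

```python
import numpy as np

def train_step(W, V, dW, dV, X, y, kl, km):
    """One online delta-rule step.
    W: (n_hidden, n_in) hidden weights; V: (n_hidden + 1,) output weights,
    with V[0] multiplying the output-layer bias; X: (n_in,) input, leading 1 = bias."""
    d = 1.0 / (1.0 + np.exp(-(W @ X)))   # sigmoidal hidden-layer outputs
    D = np.concatenate(([1.0], d))       # bias plus hidden outputs
    e = y - V @ D                        # tracking error e = y - V^T * d
    # dVnew = kl*e*D + km*dVold; Vnew = Vold + dVnew
    dV = kl * e * D + km * dV
    V = V + dV
    # dWnew = kl*e*Vnew*d*(1-d)*X + km*dWold; Wnew = Wold + dWnew
    dW = kl * e * np.outer(V[1:] * d * (1.0 - d), X) + km * dW
    W = W + dW
    return W, V, dW, dV, e
```

The MSE over a run is then simply the mean of the squared tracking errors across all epochs.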


    SIMULATION CASES

    Table 2 identifies the cases that were simulated. For example, Case 1 involved setting the number of neurons in the hidden layer to 5, using a learning gain of 0.1, and using a momentum gain of 0.005 for a zeroth order model. At the beginning of each run the weights W and V were initialized to the same sets of random values, Wo and Vo, to maintain consistency from one run to the next. Initial weight values can be found in the attached MATLAB code listing.

    Case   Neurons in Hidden Layer (n_hidden)   Identifier Order   Learning Gain kl   Momentum Gain km

    1 5 0 0.1 0.005

    2 5 0 0.2 0.2

    3 5 1 0.1 0.005

    4 5 1 0.2 0.2

    5 5 2 0.1 0.005

    6 5 2 0.2 0.2

    7 5 3 0.1 0.005

    8 5 3 0.2 0.2

    9 5 4 0.1 0.005

    10 5 4 0.2 0.2

    11 5 5 0.1 0.005

    12 5 5 0.2 0.2

    Table 2. Simulation cases for Assignment 5.


    SIMULATION RESULTS

    The following plots show the results of our simulation runs for the various combinations of neurons in the hidden layer, learning gains, momentum gains, and identifier order. The weights W and V were always initialized to the same sets of random values, Wo and Vo, to maintain consistency from one run to the next.

    A plot for each case showing the results of the tracking and the tracking error versus number of epochs is given in the following figures. Each plot has three subplots. In the first (upper) subplot the actual output (in red) is plotted along with the estimated output (in blue) as a function of time from 49 to 51 seconds. Inspection of this subplot reveals whether or not the neural network was able to approximate the input, and how well. We can make qualitative evaluations of how well the tracker performs based on amplitude and phase differences between the two traces.

    The second (middle) subplot shows the tracking error (the simple difference between actual and estimate) as a function of time, again from 49 to 51 seconds. This plot yields a more quantitative evaluation of how the tracker performs.

    The third (bottom) subplot shows the tracking error (the simple difference between actual and estimate) as a function of time at every sampling instant from 0 to 80 seconds. This plot also yields a more quantitative evaluation of how the tracker performs. The title of the third subplot includes the mean square error of the tracking error.


    Figure 2. Simulation results for Case 1 (0th Order, kl = 0.1, km = 0.005, and n_hidden = 5).


    Figure 3. Simulation results for Case 2 (0th Order, kl = 0.2, km = 0.2, and n_hidden = 5).

    Figure 4. Simulation results for Case 3 (1st Order, kl = 0.1, km = 0.005, and n_hidden = 5).


    Figure 5. Simulation results for Case 4 (1st Order, kl = 0.2, km = 0.2, and n_hidden = 5).

    Figure 6. Simulation results for Case 5 (2nd Order, kl = 0.1, km = 0.005, and n_hidden = 5).


    Figure 7. Simulation results for Case 6 (2nd Order, kl = 0.2, km = 0.2, and n_hidden = 5).


    Figure 8. Simulation results for Case 7 (3rd Order, kl = 0.1, km = 0.005, and n_hidden = 5).

    Figure 9. Simulation results for Case 8 (3rd Order, kl = 0.2, km = 0.2, and n_hidden = 5).


    Figure 10. Simulation results for Case 9 (4th Order, kl = 0.1, km = 0.005, and n_hidden = 5).

    Figure 11. Simulation results for Case 10 (4th Order, kl = 0.2, km = 0.2, and n_hidden = 5).


    Figure 12. Simulation results for Case 11 (5th Order, kl = 0.1, km = 0.005, and n_hidden = 5).

    Figure 13. Simulation results for Case 12 (5th Order, kl = 0.2, km = 0.2, and n_hidden = 5).


    TABULATED RESULTS

    The plots for the simulation cases show typical results with two different combinations of learning gain and momentum gain for each identifier order. Note that the 0th Order simulations correspond to the simulations run for Assignment 3. The mean square error was recorded for each case. The results are given in Table 3.

    Cases      Neurons in Hidden     Identifier   Mean Square Tracking Error   Mean Square Tracking Error
               Layer (n_hidden)      Order        (kl = 0.1, km = 0.005)       (kl = 0.2, km = 0.2)

    1 and 2 5 0 0.084708 0.016711

    3 and 4 5 1 0.036515 0.011746

    5 and 6 5 2 0.023515 0.010817

    7 and 8 5 3 0.013883 0.014667

    9 and 10 5 4 0.021601 0.025726

    11 and 12 5 5 0.016454 0.024176

    Table 3. Results for simulations tabulating mean square tracking error.

    All simulations were run for 80 seconds of simulation time. From this table we see that mean square tracking error decreases when the identifier order is increased from 0 to 2. A zeroth order system provides only the current value of x and the bias value of 1 as inputs to the ANN. Increasing the order to 2 provides not only previous values of x, i.e., x(k-1), x(k-2), but also previous values of the estimates, ŷ(k-1), ŷ(k-2). These previous values allow the ANN to more accurately predict the next value. For example, with the previous value x(k-1) as well as x(k), a simple differencing yields the rate of change (or the velocity) of x. Likewise, two previous values x(k-2) and x(k-1) as well as x(k) can be used to determine the rate of change of the velocity (acceleration). With higher orders more accurate prediction can take place.
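The velocity and acceleration argument above can be made concrete with simple backward differences (a sketch; dt = 0.01 s matches the 100 Hz sampling rate, and the function names are my own):

```python
DT = 0.01  # sampling period, 100 Hz

def velocity(x_k, x_km1, dt=DT):
    """Backward-difference estimate of the rate of change of x."""
    return (x_k - x_km1) / dt

def acceleration(x_k, x_km1, x_km2, dt=DT):
    """Second backward difference: rate of change of the velocity."""
    return (x_k - 2.0 * x_km1 + x_km2) / dt**2
```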

    However, increasing the order of the identifier beyond 2 tends to increase the mean square tracking error. For example, a fourth order identifier provides previous values of x, i.e., x(k-1), x(k-2), x(k-3), x(k-4), and previous values of the estimates, ŷ(k-1), ŷ(k-2), ŷ(k-3), ŷ(k-4). Increasing the number of inputs to the ANN now has a deleterious effect on system performance.

    This is most likely due to the hidden layer having only five neurons. The higher order systems are presenting more information than five neurons can manipulate. It is expected that increasing the number of neurons in the hidden layer would improve the performance of the higher order identifiers.


    Figure 14 shows the results for Case 12, but now from 49 to 57 seconds. While the identifier is able to track the abrupt change, it has difficulty maintaining its tracking. Periodically, the tracking error spikes and then settles down again. The bottom subplot in Figure 14 shows that this phenomenon continues for the full 80 seconds of simulated time.

    Figure 14. Simulation results for Case 12 (5th Order, kl = 0.2, km = 0.2, and n_hidden = 5), showing the estimate and tracking error from 49 to 57 seconds.


    CONCLUSIONS

    For this assignment we used the neural network as a continuous online identifier. After 50 seconds of training, the function of the plant was abruptly changed. We investigated various combinations of system order, learning gain, and momentum gain, with five neurons in the hidden layer.

    Improvements in performance were found when increasing the order of the identifier from 0 to 2. Increases in the order of the identifier beyond 2 tended to decrease system performance (increase mean square tracking error). It is strongly suspected that this was due to having only five neurons in the hidden layer. It is expected that increasing the number of neurons in the hidden layer would improve the performance of higher order identifiers.

    Additional simulations would have to be performed to confirm this.


    MATLAB SOURCE CODE

    %ANN DEFINITION
    n_hidden = 5;
    n_out = 1;

    %LEARNING GAIN
    kl = 0.2;

    %MOMENTUM GAIN
    km = 0.2;

    %USE SAME RANDOM NUMBER STREAM EVERY TIME FOR Wo AND Vo

    Wo = [  0.9003,  0.8436, -0.7222,  0.6924, -0.3908, -0.0069,  0.6770, -0.6541, -0.9630,  0.7873,  0.4936, -0.2410; ...
           -0.5377,  0.4764, -0.5945,  0.0503, -0.6207,  0.7995,  0.1361,  0.9595,  0.6428, -0.8842, -0.1098,  0.6636; ...
            0.2137, -0.6475, -0.6026, -0.5947, -0.6131,  0.6433, -0.2592, -0.4571, -0.1106, -0.2943,  0.8636,  0.0056; ...
           -0.0280, -0.1886,  0.2076,  0.3443,  0.3644,  0.2898,  0.4055, -0.4953,  0.2309,  0.6263, -0.0680,  0.4189; ...
            0.7826,  0.8709, -0.4556,  0.6762, -0.3945,  0.6359,  0.0931,  0.7515,  0.5839, -0.9803, -0.1627, -0.1422 ];

    Vo = [ -0.4312, -0.0616, -0.8704,  0.9767,  0.1656, -0.1530 ];

    W = Wo;
    V = Vo;

    %BACKPROPAGATION ALGORITHM
    max_epoch = 8000;
    MSE_umbral = 1e-6;

    epoch = 1;
    MSE(epoch) = 1;
    dV = zeros(1, n_hidden + 1);
    dW = zeros(n_hidden, 12);

    %USE SAME RANDOM NUMBER STREAM EVERY TIME FOR X

    %RandStream.setDefaultStream(RandStream('mt19937ar','seed',2.3232e+005));

    %Initialize inputs
    x = 0.0; xm1 = 0.0; xm2 = 0.0; xm3 = 0.0; xm4 = 0.0; xm5 = 0.0;


    y = 0.0; ym1 = 0.0; ym2 = 0.0; ym3 = 0.0; ym4 = 0.0; ym5 = 0.0;

    %while ((epoch < max_epoch + 1) & (MSE(epoch) > MSE_umbral))

    while (epoch < max_epoch + 1)

    %x(1,1) = 2 * rand(1,1) - 1;
    %y = 2 * x.^2 + 1;

    %Update ANN inputs
    xm5 = xm4; xm4 = xm3; xm3 = xm2; xm2 = xm1; xm1 = x;

    x = sin(2*pi*epoch/100.0);

    ym5 = ym4; ym4 = ym3; ym3 = ym2; ym2 = ym1; ym1 = y;

    %Plant output; the function changes abruptly after 50 s (5000 samples)
    if ( epoch <= 5000 )
        y = 2*x^2 + 1;
    else
        y = 2.5*(x - 0.2)^2 + 1.5;
    end;

    %Forward pass through the ANN
    X = [1, x, xm1, xm2, xm3, xm4, xm5, ym1, ym2, ym3, ym4, ym5];
    A = W*X';
    d = 1 ./ (1 + exp(-A));
    yest = V*[1; d];

    %Tracking error
    e = y - yest;
    TE(epoch) = e;
    yp(epoch) = y;      %actual output, saved for plotting
    Yp(epoch) = yest;   %estimate, saved for plotting

    %Delta weight updates
    dV = kl*e*[1, d'] + km*dV;
    V = V + dV;

    for k = 1:n_hidden


        dW(k,:) = kl*(e*V(k+1))*d(k)*(1-d(k))*X + km*dW(k,:);
    end;

    W = W + dW;

    %Sum of Square Errors:
    %xe(epoch,1) = x;
    %ye(epoch,1) = y;

    %if (epoch < 500),
    %    Xe = [ones(epoch,1) xe];
    %    Ye = ye;
    %else
    %    Xe = [ones(500,1) xe(epoch-499:epoch,1)];
    %    Ye = ye(epoch-499:epoch,1);
    %end;

    %Ae = W*Xe';
    %De = 1 ./ (1 + exp(-Ae));
    %Yest = (V*[ones(1,length(Xe(:,1))); De])';
    %E = Ye - Yest;

    %MSE(epoch+1) = mean(E.^2);
    %MSE(epoch) = mean(E.^2);

    epoch = epoch + 1;
    end;

    % Save final values of weights
    Wf = W;
    Vf = V;

    % Find the mean square of the tracking error
    MSEm = mean( TE.^2 );

    %PLOTS

    %N = 100;
    N = max_epoch;

    %xp = [-1:2/N:1]';
    xp = [0.01 : 0.01 : max_epoch/100.0]';

    %yp = 2*xp.^2 + 1;
    %yp = 2 * sin( 2*xp.^2 + 1 );
    %Xp = [ones(N+1,1) xp];
    %Ap = W*Xp';
    %Dp = 1 ./ (1 + exp(-Ap));
    %Yp = (V*[ones(1,N+1); Dp])';
    MSEp = min(MSE);
    MSEs = num2str(MSEp);

    % Plot from 49 to 51 seconds
    subplot(3,1,1);
    plot(xp(4900:5100), yp(4900:5100), 'r', xp(4900:5100), Yp(4900:5100));
    title(strcat('Fifth Order Model, y(x)=2x^2+1 (red) and Estimated output (blue), n_h_i_d_d_e_n = ', num2str(n_hidden), ', k_l = ', num2str(kl), ', k_m = ', num2str(km) ));
    xlabel( 'Time (sec)' );
    ylabel('y(x)=2x^2+1, y_e_s_t_i_m_a_t_e_d');


    % Plot tracking error from 49 to 51 seconds
    subplot(3,1,2);
    plot(xp(4900:5100), TE(4900:5100), 'r');
    title('Tracking Error');
    xlabel('Time (sec)');
    ylabel('Tracking Error');

    % Plot Tracking Error
    subplot(3,1,3);
    title( strcat( 'Tracking Error, Mean Square Error = ', num2str(MSEm) ) );
    xlabel('Time (sec)');
    ylabel('Tracking Error');
    hold on;
    plot(xp(1:8000), TE(1:8000), 'r');
    hold off;

    MSEm
