Chapter 3 System ID


Chapter 3 System ID for adaptive control

1. Scalar case, one unknown parameter, intuitive method 1
2. Scalar case, one unknown parameter, intuitive method 2
3. Scalar case, one unknown parameter, un-normalized
4. Scalar case, one unknown parameter, normalized
5. First-order linear system, scalar case, un-normalized
6. First-order linear system, vector case, un-normalized
7. First-order linear system, vector case, normalized
8. SPR-Lyapunov approach, normalized
9. Gradient method, normalized
10. Least squares, normalized


1. Scalar case, one unknown parameter, intuitive method 1

System: y(t) = θ u(t)

where y(t) = output, u(t) = input, θ = unknown constant.

Assume that y(t) and u(t) are measurable for all t ≥ t₀ and it is desired to obtain an estimate of θ.

Solution: get the I/O pair {u(t), y(t)} over an interval [t₀, t₀ + T] so that u(t) ≠ 0 for some t ∈ [t₀, t₀ + T], and choose the estimate as

θ̂ = y(t)/u(t), t ∈ [t₀, t₀ + T], u(t) ≠ 0.

Remark: if u(t) is very close to zero, θ̂ = y(t)/u(t) becomes very big (unbounded).

Remark: it would be worse if the system is subjected to noise. The estimate would show a large spread in the value of θ̂ at different t ∈ [t₀, t₀ + T].

Example (review): y(t) = θ u(t) + d. Then θ̂ = y(t)/u(t) = θ + d/u(t), so when u is close to zero the measurement error/noise d is amplified by 1/u.
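
As a quick numerical illustration of this remark, here is a minimal Python/NumPy sketch (the true value θ = 2, the input u(t) = sin t, and the noise level are illustrative assumptions, not values from the notes). It computes θ̂(t) = y(t)/u(t) from noisy data and shows the estimate spreading out wherever u(t) is near zero.

    import numpy as np

    rng = np.random.default_rng(0)
    theta = 2.0                                  # true (unknown) parameter
    t = np.linspace(0.0, 10.0, 1001)
    u = np.sin(t)                                # input, crosses zero repeatedly
    y = theta * u + 0.05 * rng.standard_normal(t.size)   # y = theta*u + noise d

    nz = np.abs(u) > 1e-6                        # avoid dividing by exactly zero
    theta_hat = y[nz] / u[nz]                    # intuitive method 1: theta_hat = y/u
    u_nz = u[nz]
    # near the zero crossings of u the noise d is amplified by 1/u:
    print("spread of theta_hat where |u| > 0.5 :", theta_hat[np.abs(u_nz) > 0.5].std())
    print("spread of theta_hat where |u| < 0.05:", theta_hat[np.abs(u_nz) < 0.05].std())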

2. Scalar case, one unknown parameter, intuitive method 2

Let us define an estimator with output ŷ(t) = θ̂ u(t), t ∈ [t₀, t₀ + T]. The solution may be considered as the one that minimizes the performance index

J(θ̂) = (1/2) ∫_{t₀}^{t₀+T} [y(τ) − θ̂ u(τ)]² dτ.

Its minimization is evaluated by setting dJ/dθ̂ = −∫_{t₀}^{t₀+T} [y(τ) − θ̂ u(τ)] u(τ) dτ = 0, which gives

θ̂ = [∫_{t₀}^{t₀+T} y(τ) u(τ) dτ] / [∫_{t₀}^{t₀+T} u²(τ) dτ].

This is well defined provided u(t) is not identically zero on [t₀, t₀ + T], i.e. ∫_{t₀}^{t₀+T} u²(τ) dτ > 0.

Remark: as in the previous method, θ̂ can only be calculated once, as a batch computation over the interval; if no noise is present it gives the true θ.

Remark: if noise shows up, it might not give the true value of θ.

Remark: increasing T will improve the estimate. To update the estimate as new data arrive we need RLS (recursive least squares), because it is impractical to recompute the integrals over a new time interval every time.
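
A minimal sketch of this batch estimate (Python/NumPy; the test signal, noise level, and θ = 2 are illustrative assumptions). The ratio of integrals averages the noise instead of amplifying it:

    import numpy as np

    rng = np.random.default_rng(1)
    theta = 2.0
    t = np.linspace(0.0, 10.0, 1001)                 # interval [t0, t0+T]
    u = np.sin(t)
    y = theta * u + 0.05 * rng.standard_normal(t.size)

    # theta_hat = (integral of y*u) / (integral of u^2) over [t0, t0+T]
    # (Riemann sums; the common step dt cancels in the ratio)
    theta_hat = np.sum(y * u) / np.sum(u * u)
    print("batch estimate:", theta_hat)              # close to theta = 2.0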

3. Scalar case, one unknown parameter, un-normalized

System: y(t) = θ u(t), where θ is an unknown constant and u, y are measurable. We would like to determine θ from the measured data on-line.

Let us consider the estimated output ŷ = θ̂ u. Define the output error

ε = y − ŷ = y − θ̂ u = −(θ̂ − θ) u = −θ̃ u,

where θ̃ = θ̂ − θ is the parameter error.

Define the performance index J(θ̂) = ε²/2 = (y − θ̂ u)²/2 (J is pointwise in time).

J(θ̂) is convex over θ̂, and hence the minimization of J is a global convex optimization problem. (A function f is said to be convex if f(λx₁ + (1 − λ)x₂) ≤ λ f(x₁) + (1 − λ) f(x₂) for all λ ∈ [0, 1].)

Using the gradient method,

dθ̂/dt = −γ dJ/dθ̂ = γ (y − θ̂ u) u = γ ε u, γ > 0.

Stability of the estimator above can be proved as follows.

Let V(θ̃) = θ̃²/(2γ). (V(x) is p.d. if V(0) = 0 and V(x) > 0 for all x ≠ 0.)

dV/dt = θ̃ (dθ̃/dt)/γ = θ̃ (γ ε u)/γ = θ̃ ε u = −ε² ≤ 0,

since ε = −θ̃ u and dθ̃/dt = dθ̂/dt. Hence V is non-increasing and the parameter error θ̃ is bounded.

Check: ∫_0^∞ ε²(τ) dτ = V(0) − V(∞) < ∞, so ε ∈ L₂.

Check: dε/dt = −(dθ̃/dt) u − θ̃ (du/dt); if u and du/dt are bounded, then dε/dt is bounded.

If ε ∈ L₂ and dε/dt ∈ L∞, then ε → 0 as t → ∞ (Barbalat's lemma), i.e. ŷ = θ̂ u → y and J → 0.

When will θ̃ → 0 as t → ∞?

A necessary and sufficient condition for θ̃ → 0 is that u satisfies the PE condition

∫_t^{t+T₀} u²(τ) dτ ≥ α₀ T₀ for all t ≥ 0 and for some α₀, T₀ > 0.

PE = persistent excitation: u should be rich enough to have parameter convergence.
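
A minimal simulation sketch of the un-normalized gradient law dθ̂/dt = γ ε u (Python/NumPy, forward-Euler integration; the true θ = 2, the gain γ, and the input u(t) = sin t, which satisfies the PE condition, are illustrative assumptions):

    import numpy as np

    theta = 2.0          # true parameter (unknown to the estimator)
    gamma = 5.0          # adaptation gain
    dt, T = 1e-3, 20.0

    theta_hat = 0.0
    for tk in np.arange(0.0, T, dt):
        u = np.sin(tk)                        # PE input
        y = theta * u                         # plant output y = theta*u
        eps = y - theta_hat * u               # output error eps = y - y_hat
        theta_hat += dt * gamma * eps * u     # gradient update law
    print("final estimate:", theta_hat)       # converges to theta = 2.0 since u is PE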

4. Scalar case, one unknown parameter, normalized

System: y(t) = θ u(t), where θ is an unknown constant and u, y are measurable, but u is not necessarily bounded.

Since u could now be unbounded, we may not use the error ε = y − θ̂ u directly, because it would give an unbounded ε (and an unbounded update) even for a small parameter error. Let us divide the system equation by the normalizing signal m > 0 to have

ȳ = θ ū, where ȳ = y/m, ū = u/m, and m² = 1 + n_s².

The signal n_s is chosen so that u/m ∈ L∞; e.g. pick n_s = u, so that

m² = 1 + u².

Consider the estimate ŷ = θ̂ u and the normalized output error

ε = (y − ŷ)/m² = (y − θ̂ u)/m² = −θ̃ u/m².

J(θ̂) = ε² m²/2 = (y − θ̂ u)²/(2 m²),

dJ/dθ̂ = −(y − θ̂ u) u/m² = −ε u.

Note that ε m = (y − θ̂ u)/m is well defined since m > 0.

Using the gradient method,

dθ̂/dt = −γ dJ/dθ̂ = γ ε u, γ > 0.

Let V = θ̃²/(2γ). Then

dV/dt = θ̃ (dθ̃/dt)/γ = θ̃ ε u = −ε² m² ≤ 0,

so θ̃ is bounded and ε m ∈ L₂. If d(ε m)/dt ∈ L∞, then ε m → 0 as t → ∞ and dθ̂/dt → 0 as t → ∞.

For parameter convergence, we need PE (persistent excitation).
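
A sketch of the normalized law for comparison (Python/NumPy; the growing input u(t) = e^{0.2 t} is an illustrative assumption chosen to show why normalization matters). Dividing by m² = 1 + u² keeps ε u, and hence the update, bounded even when u grows without bound:

    import numpy as np

    theta, gamma = 2.0, 5.0
    dt, T = 1e-3, 20.0
    theta_hat = 0.0
    for tk in np.arange(0.0, T, dt):
        u = np.exp(0.2 * tk)                  # unbounded input
        y = theta * u
        m2 = 1.0 + u * u                      # m^2 = 1 + n_s^2 with n_s = u
        eps = (y - theta_hat * u) / m2        # normalized output error
        theta_hat += dt * gamma * eps * u     # eps*u = (theta-theta_hat)*u^2/(1+u^2) stays bounded
    print("final estimate:", theta_hat)       # approaches theta = 2.0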

5. ID of first-order linear systems, scalar case (un-normalized)

Plant: dx_p/dt = a_p x_p + k_p u,

where x_p ∈ R is the output of the plant and u ∈ R is the bounded input. a_p and k_p are unknown constants, but we know that a_p < 0 (the plant is stable).

The problem is to determine a_p and k_p from u and x_p.

Here, we are going to solve this problem with 2 estimators:

1. Parallel estimator (output error method): only for scalar systems (not for the vector case).
2. Series-parallel estimator (equation error method): for scalar and vector systems.

Method 1: output error method

Select the parallel estimator

dx̂_p/dt = â_p x̂_p + k̂_p u.

With e = x_p − x̂_p, the error dynamics are

de/dt = a_p e − ã_p x̂_p − k̃_p u,

where the parameter errors are ã_p = â_p − a_p and k̃_p = k̂_p − k_p. (The output error dynamics are driven by the output error.)


Define a Lyapunov function candidate

V = e²/2 + ã_p²/(2γ₁) + k̃_p²/(2γ₂), γ₁, γ₂ > 0.

Taking the time derivative of V along the trajectories of the error dynamics, we have

dV/dt = a_p e² + ã_p (dâ_p/dt / γ₁ − e x̂_p) + k̃_p (dk̂_p/dt / γ₂ − e u).

Pick the update laws

dâ_p/dt = γ₁ e x̂_p, dk̂_p/dt = γ₂ e u,

so that dV/dt = a_p e² ≤ 0 (recall a_p < 0).

[Block diagram: bounded input u → stable LTI system → bounded output x_p. BIBO stability: a stable LTI system with a bounded input (BI) produces a bounded output (BO), so x_p is bounded, and with e bounded, x̂_p is bounded too.]
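
A minimal simulation sketch of the parallel (output-error) estimator with the update laws picked above (Python/NumPy, forward Euler; the plant values a_p = −1, k_p = 3, the gains, the input, and the initial estimates are illustrative assumptions):

    import numpy as np

    a_p, k_p = -1.0, 3.0        # true plant parameters (unknown, a_p < 0)
    g1, g2 = 5.0, 5.0           # adaptation gains gamma_1, gamma_2
    dt, T = 1e-3, 200.0

    x_p, x_hat = 0.0, 0.0
    a_hat, k_hat = -0.5, 0.0    # initial parameter estimates
    for tk in np.arange(0.0, T, dt):
        u = np.sin(tk) + np.sin(0.5 * tk)           # input rich enough (PE) for 2 parameters
        e = x_p - x_hat                             # output error
        a_hat += dt * g1 * e * x_hat                # update law: da_hat/dt = g1*e*x_hat
        k_hat += dt * g2 * e * u                    # update law: dk_hat/dt = g2*e*u
        x_hat += dt * (a_hat * x_hat + k_hat * u)   # parallel estimator (driven by x_hat itself)
        x_p   += dt * (a_p * x_p + k_p * u)         # plant
    print("a_hat, k_hat:", a_hat, k_hat)            # approach a_p = -1, k_p = 3 under PE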

Summary: with the parallel estimator, e, ã_p, and k̃_p are bounded and e → 0; for parameter convergence, we need PE.

Method 2: equation error method

Select the series-parallel estimator

dx̂_p/dt = a_m (x̂_p − x_p) + â_p x_p + k̂_p u,

where a_m < 0 is a design constant. With e = x_p − x̂_p, the error dynamics are

de/dt = a_m e − ã_p x_p − k̃_p u.

Pick V = e²/2 + ã_p²/(2γ₁) + k̃_p²/(2γ₂) and the update laws

dâ_p/dt = γ₁ e x_p, dk̂_p/dt = γ₂ e u,

so that dV/dt = a_m e² ≤ 0 since a_m < 0. All errors are bounded and e → 0; we need PE for convergence of the parameters.
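
A matching sketch of the series-parallel (equation-error) estimator (Python/NumPy; a_m = −2 and the other numerical values are illustrative assumptions). Compared with the parallel version, only the estimator equation changes and the regressor is the measured x_p instead of x̂_p:

    import numpy as np

    a_p, k_p = -1.0, 3.0        # true plant parameters
    a_m = -2.0                  # design constant, a_m < 0
    g1, g2 = 5.0, 5.0
    dt, T = 1e-3, 200.0

    x_p, x_hat = 0.0, 0.0
    a_hat, k_hat = 0.0, 0.0
    for tk in np.arange(0.0, T, dt):
        u = np.sin(tk) + np.sin(0.5 * tk)
        e = x_p - x_hat
        a_hat += dt * g1 * e * x_p                  # update law uses measured x_p
        k_hat += dt * g2 * e * u
        # series-parallel estimator: driven by the measured plant state x_p
        x_hat += dt * (a_m * (x_hat - x_p) + a_hat * x_p + k_hat * u)
        x_p   += dt * (a_p * x_p + k_p * u)
    print("a_hat, k_hat:", a_hat, k_hat)            # approach a_p = -1, k_p = 3 under PE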

6. ID of first-order linear systems, vector case (un-normalized)

Plant: dx/dt = A x + B u, where x ∈ R^n and u ∈ R^r are measurable. Assume that A is Hurwitz (stable) and u is bounded. We would like to find A and B based on the information of x and u.

Method 1: output error method

Define the parallel estimator

dx̂/dt = Â x̂ + B̂ u.

With e = x − x̂, Ã = Â − A, B̃ = B̂ − B, the error dynamics can be written as

de/dt = A e − Ã x̂ − B̃ u.

Pick the Lyapunov function candidate

V = eᵀ P e + tr(Ãᵀ Γ₁⁻¹ Ã) + tr(B̃ᵀ Γ₂⁻¹ B̃), P = Pᵀ > 0.

Then dV/dt contains the term eᵀ(Aᵀ P + P A)e. Although A is known to be stable, its elements are not known, and hence a matrix P > 0 satisfying

Aᵀ P + P A = −Q with Q > 0

cannot be computed. This approach does not work for the vector case, but it is fine for the scalar case.

Method 2: equation error method

Define the series-parallel estimator

dx̂/dt = A_m (x̂ − x) + Â x + B̂ u,

where A_m is a known Hurwitz matrix, so that de/dt = A_m e − Ã x − B̃ u. Let

V = eᵀ P e + tr(Ãᵀ Γ₁⁻¹ Ã) + tr(B̃ᵀ Γ₂⁻¹ B̃).

Since A_m is known and stable, we can pick Q = Qᵀ > 0 and solve A_mᵀ P + P A_m = −Q for P = Pᵀ > 0. Pick the update laws

dÂ/dt = Γ₁ P e xᵀ, dB̂/dt = Γ₂ P e uᵀ,

so that the cross terms cancel and

dV/dt = −eᵀ Q e ≤ 0.

Because Q is a constant positive definite matrix, this gives bounded errors, e ∈ L₂, and e → 0. We need PE for parameter convergence!
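
A sketch of the vector-case series-parallel estimator with the matrix update laws above (Python/NumPy; the 2x2 plant, the gains, and the choice A_m = −2I, for which P = I/4 solves A_mᵀP + PA_m = −I, are illustrative assumptions):

    import numpy as np

    A = np.array([[-1.0, 2.0], [0.0, -3.0]])   # true A (Hurwitz), unknown to the estimator
    B = np.array([[1.0], [0.5]])               # true B, unknown
    Am = -2.0 * np.eye(2)                      # known stable design matrix
    P = np.eye(2) / 4.0                        # solves Am^T P + P Am = -I
    g1, g2 = 5.0, 5.0
    dt, T = 1e-3, 200.0

    x = np.zeros((2, 1))
    x_hat = np.zeros((2, 1))
    A_hat = np.zeros((2, 2))
    B_hat = np.zeros((2, 1))
    for tk in np.arange(0.0, T, dt):
        u = np.array([[np.sin(tk) + np.sin(0.5 * tk) + np.cos(2.0 * tk)]])   # rich scalar input
        e = x - x_hat
        A_hat += dt * g1 * (P @ e) @ x.T       # dA_hat/dt = Gamma_1 * P * e * x^T
        B_hat += dt * g2 * (P @ e) @ u.T       # dB_hat/dt = Gamma_2 * P * e * u^T
        x_hat += dt * (Am @ (x_hat - x) + A_hat @ x + B_hat @ u)   # series-parallel estimator
        x     += dt * (A @ x + B @ u)                              # plant
    print("A_hat:\n", A_hat, "\nB_hat:\n", B_hat)   # approach A, B when the regressor (x, u) is PE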

What about an unbounded u?

7. ID of first-order systems, normalized

Plant: dx/dt = a x + b u, where a and b are unknown constants and u is not necessarily bounded.

Let

x̂ = [1/(s − a_m)] [(â − a_m) x + b̂ u + n_s² e], a_m < 0,

and define e = x − x̂, where s is the Laplace variable (used as the differentiation operator).

n_s is to be designed s.t. x/m ∈ L∞ and u/m ∈ L∞, with m² = 1 + n_s², and we pick n_s² = x² + u².

Update laws: dâ/dt = γ₁ e x, db̂/dt = γ₂ e u (derived below).

The error dynamics follow from (s − a_m) e = (s − a_m)(x − x̂):

de/dt = dx/dt − dx̂/dt
      = (a x + b u) − [a_m x̂ + (â − a_m) x + b̂ u + n_s² e]
      = a_m (x − x̂) + (a − â) x + (b − b̂) u − n_s² e
      = (a_m − n_s²) e + ã x + b̃ u,

where ã = a − â and b̃ = b − b̂.

Let V = e²/2 + ã²/(2γ₁) + b̃²/(2γ₂). Along the error dynamics,

dV/dt = (a_m − n_s²) e² + ã (x e − dâ/dt / γ₁) + b̃ (u e − db̂/dt / γ₂).

With the update laws dâ/dt = γ₁ e x and db̂/dt = γ₂ e u,

dV/dt = (a_m − n_s²) e² ≤ −δ (e² + n_s² e²) for some δ > 0.

Hence e, ã, b̃ are bounded, e ∈ L₂ and e n_s ∈ L₂. Since x and u may be unbounded,

de/dt = (a_m − n_s²) e + ã x + b̃ u

need not be bounded, so we cannot conclude asymptotic stability of e directly.

Q.E.D.

Equivalently, e m ∈ L₂:

∫_0^∞ e²(τ) m²(τ) dτ = ∫_0^∞ e² (1 + n_s²) dτ < ∞.
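
A sketch of this normalized first-order identifier (Python/NumPy, forward Euler; a stable plant a = −0.5, b = 2, the gains, and the input are illustrative assumptions chosen so a fixed-step simulation is well behaved; the estimator includes the n_s² e term that produces the (a_m − n_s²) damping derived above):

    import numpy as np

    a, b = -0.5, 2.0          # true plant parameters (unknown)
    a_m = -1.0                # design constant, a_m < 0
    g1, g2 = 5.0, 5.0
    dt, T = 1e-3, 200.0

    x, x_hat = 0.0, 0.0
    a_hat, b_hat = 0.0, 0.0
    for tk in np.arange(0.0, T, dt):
        u = np.sin(tk) + np.sin(0.5 * tk)
        e = x - x_hat
        ns2 = x * x + u * u                   # n_s^2 = x^2 + u^2  (m^2 = 1 + n_s^2)
        a_hat += dt * g1 * e * x              # update laws
        b_hat += dt * g2 * e * u
        # estimator with the extra n_s^2*e term in its dynamics
        x_hat += dt * (a_m * x_hat + (a_hat - a_m) * x + b_hat * u + ns2 * e)
        x     += dt * (a * x + b * u)
    print("a_hat, b_hat:", a_hat, b_hat)      # approach a = -0.5, b = 2 under PE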

In general, we represent a linear parametric model as

y = G(s) [θ*ᵀ ψ],

where θ* is the unknown parameter vector and ψ is a measurable signal vector. (Review: e = x − x̂ and the update laws above fit this form.)

Positive linear systems and SPR (Strictly Positive Real)

Positive linear systems are linear systems with positive real transfer functions.

System: an n-th order SISO linear system with transfer function h(s).

Def: h(s) is positive real (PR) if Re[h(s)] ≥ 0 whenever Re[s] ≥ 0. h(s) is strictly positive real (SPR) if h(s − ε) is PR for some ε > 0.

e.g. h(s) = 1/(s + λ), λ > 0. For s = jω,

h(jω) = 1/(λ + jω) = (λ − jω)/(λ² + ω²), so Re[h(jω)] = λ/(λ² + ω²) > 0,

and Re[h(s)] ≥ 0 whenever Re[s] ≥ 0, so h(s) is PR.

For SPR: pick h(s − ε) = 1/(s − ε + λ); for any 0 < ε < λ this is PR, hence h(s) = 1/(s + λ) is SPR.

Thm: h(s) is SPR iff

(i) h(s) is strictly stable, and
(ii) Re[h(jω)] > 0 for all ω.

Some useful necessary conditions to check SPR:

1. h(s) is strictly stable. E.g. a transfer function with a pole that is not in the left half-plane, such as 1/(s − 1), is not SPR.
2. h(s) has relative degree 0 or 1. E.g. a relative-degree-2 transfer function such as 1/(s + 1)² is not SPR.
3. h(s) is strictly minimum phase. E.g. a transfer function with a right half-plane zero, such as (s − 1)/(s + 2)², is not SPR.
4. The Nyquist plot of h(jω) lies entirely in the RHP (right half-plane).
5. The phase shift in response to a sinusoidal input is always less than 90°.
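
Condition (ii) can be checked numerically; a minimal sketch (Python/NumPy; the two example transfer functions are illustrative assumptions). It evaluates Re[h(jω)] on a frequency grid for h(s) given by numerator/denominator coefficients; stability of the poles must still be checked separately (e.g. with np.roots):

    import numpy as np

    def re_h_jw(num, den, w):
        # real part of h(jw) for h(s) = num(s)/den(s), coefficients in descending powers of s
        jw = 1j * w
        return np.real(np.polyval(num, jw) / np.polyval(den, jw))

    w = np.logspace(-3, 3, 2000)                       # frequency grid
    # h1(s) = 1/(s + 1): stable, relative degree 1, Re[h1(jw)] > 0 for all w  ->  SPR
    print("h1 min Re:", re_h_jw([1.0], [1.0, 1.0], w).min())
    # h2(s) = 1/(s^2 + s + 1): stable but relative degree 2, Re[h2(jw)] goes negative  ->  not SPR
    print("h2 min Re:", re_h_jw([1.0], [1.0, 1.0, 1.0], w).min())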

The Kalman-Yakubovich lemma (positive real lemma)

Lemma: consider the system

dx/dt = A x + B u, y = Cᵀ x, with (A, B) controllable.

The transfer function h(s) = Cᵀ(sI − A)⁻¹B is SPR iff there exist P = Pᵀ > 0 and Q = Qᵀ > 0 such that (s.t.)

Aᵀ P + P A = −Q,
P B = C.

(Terminology: controllable ⇒ stabilizable, but an uncontrollable system may still be stabilizable; p.c. = piecewise continuous.)

Modified K-Y lemma (Kalman-Yakubovich-Meyer lemma): given a scalar γ ≥ 0, vectors B and C, an asymptotically stable matrix A, and a symmetric p.d. matrix L, if the transfer function

h(s) = γ/2 + Cᵀ(sI − A)⁻¹B

is SPR, then there exist a scalar ε > 0, a vector q, and P = Pᵀ > 0 s.t.

Aᵀ P + P A = −q qᵀ − ε L,
P B = C + √γ q.

8. ID of a general SISO plant: SPR-Lyapunov approach, normalized

Plant: y = G(s) [θ*ᵀ ψ], y ∈ R.

Choose L(s) s.t. L⁻¹(s) is a proper stable transfer function and G(s) L(s) is a proper SPR transfer function. Then

y = G(s) L(s) [L⁻¹(s) (θ*ᵀ ψ)] = G(s) L(s) [θ*ᵀ φ], where φ = L⁻¹(s) ψ

(θ* is constant, so it commutes with the filter L⁻¹(s)).

e.g. with L(s) = s + 2, L⁻¹(s) = 1/(s + 2):

y = G(s) L(s) [θ*ᵀ φ], φ = [1/(s + 2)] ψ.

An estimator can be constructed as ŷ = G(s) L(s) [θ̂ᵀ φ].

Case 1: G(s) is minimum phase.

Pick L(s) = G⁻¹(s), so that G L = 1 and y = θ*ᵀ φ.

Define ε = y − ŷ = −θ̃ᵀ φ, where θ̃ = θ̂ − θ*, etc.

Case 2: G(s) is not required to be minimum phase.

Define the normalized estimation error

ε = y − ŷ − G(s) L(s) [ε n_s²].

Then ε = G(s) L(s) [−θ̃ᵀ φ − ε n_s²]. This error equation is realized in state-space form, and since G(s) L(s) is SPR, the K-Y (MKY) lemma provides the P = Pᵀ > 0 used in the Lyapunov analysis.
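
A minimal numerical check of the identity y = G(s)L(s)[θ*ᵀφ] with φ = L⁻¹(s)ψ used above (Python/NumPy, forward Euler; L(s) = s + 2 is the example from the notes, while G(s) = 1/(s + 1), the scalar θ*, and ψ(t) = sin t are illustrative assumptions). Because θ* is constant it commutes with the filters, so filtering ψ by L⁻¹ first and then by GL gives the same output as filtering θ*ψ by G:

    import numpy as np

    theta = 1.5                     # constant true parameter theta*
    dt, T = 1e-4, 20.0
    z1 = z2 = z3 = 0.0              # filter states, zero initial conditions
    max_diff = 0.0
    for tk in np.arange(0.0, T, dt):
        psi = np.sin(tk)
        # path 1: y1 = G(s)[theta*psi]        with G(s) = 1/(s+1)
        z1 += dt * (-z1 + theta * psi)
        y1 = z1
        # path 2: phi = L^{-1}(s)[psi]        with L^{-1}(s) = 1/(s+2)
        z2 += dt * (-2.0 * z2 + psi)
        phi = z2
        #         y2 = G(s)L(s)[theta*phi]    with G(s)L(s) = (s+2)/(s+1) = 1 + 1/(s+1)
        z3 += dt * (-z3 + theta * phi)
        y2 = theta * phi + z3
        max_diff = max(max_diff, abs(y1 - y2))
    print("max |y1 - y2| =", max_diff)     # small: only forward-Euler discretization error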

9. ID: Gradient method, normalized

System: in the parametric model

z = θ*ᵀ φ,

where z and φ are measurable and θ* is the unknown parameter vector.

Two approaches are introduced below, based on different cost functions.

Case 1: instantaneous cost function

J(θ̂) = ε² m²/2 = (z − θ̂ᵀ φ)²/(2 m²), m² = 1 + n_s².

Convexity of J guarantees the existence of a global minimum defined by ∇J(θ̂) = 0.

(1) Non-recursive algorithm 1

(2) Non-recursive algorithm 2

(Reference: Adaptive Control of Linear Time-Varying Systems, Ph.D. thesis; Ioannou and Tsakalis.)

(3) Recursive algorithm

March 30, 2006

dθ̂/dt = Γ ε φ, ε = (z − θ̂ᵀ φ)/m², Γ = Γᵀ > 0.

It means: the estimate is moved along −∇J(θ̂) (scaled by Γ) so as to make the cost J(θ̂) decrease.
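
A sketch of this recursive normalized gradient algorithm for the linear parametric model z = θ*ᵀφ (Python/NumPy; the two-parameter example, Γ = 10I, and the regressor are illustrative assumptions):

    import numpy as np

    theta_star = np.array([2.0, -1.0])    # true parameter vector (unknown)
    Gamma = 10.0 * np.eye(2)              # adaptive gain, Gamma = Gamma^T > 0
    dt, T = 1e-3, 100.0

    theta_hat = np.zeros(2)
    for tk in np.arange(0.0, T, dt):
        phi = np.array([np.sin(tk), np.sin(0.5 * tk)])   # PE regressor
        z = theta_star @ phi                             # measured signal z = theta*^T phi
        m2 = 1.0 + phi @ phi                             # m^2 = 1 + n_s^2 with n_s^2 = phi^T phi
        eps = (z - theta_hat @ phi) / m2                 # normalized estimation error
        theta_hat += dt * (Gamma @ (eps * phi))          # dtheta_hat/dt = Gamma*eps*phi
    print("theta_hat:", theta_hat)                       # approaches theta_star under PE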

Case 2: integral cost function

J(θ̂) = (1/2) ∫_0^t e^{−β(t−τ)} [z(τ) − θ̂ᵀ(t) φ(τ)]²/m²(τ) dτ, β > 0,

where the exponential weighting discounts old data. The gradient is

∇J(θ̂) = −∫_0^t e^{−β(t−τ)} [z(τ) − θ̂ᵀ(t) φ(τ)] φ(τ)/m²(τ) dτ = R(t) θ̂ + Q(t),

where

R(t) = ∫_0^t e^{−β(t−τ)} φ(τ) φᵀ(τ)/m²(τ) dτ,
Q(t) = −∫_0^t e^{−β(t−τ)} z(τ) φ(τ)/m²(τ) dτ.

The gradient algorithm is then

dθ̂/dt = −Γ ∇J(θ̂) = −Γ [R(t) θ̂ + Q(t)], Γ = Γᵀ > 0.

Note that R(t) and Q(t) are exactly the solutions to the following two non-homogeneous D.E.s (differential equations), respectively:

dR/dt = −β R + φ φᵀ/m², R(0) = 0,
dQ/dt = −β Q − z φ/m², Q(0) = 0,

so the algorithm can be implemented by integrating these equations on-line.
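
A sketch of the integral-cost algorithm, integrating the R and Q differential equations above together with dθ̂/dt = −Γ(Rθ̂ + Q) (Python/NumPy; β = 1, Γ = 10I, and the same illustrative two-parameter model as before are assumptions):

    import numpy as np

    theta_star = np.array([2.0, -1.0])          # true parameter vector (unknown)
    Gamma = 10.0 * np.eye(2)
    beta = 1.0                                  # forgetting factor
    dt, T = 1e-3, 100.0

    theta_hat = np.zeros(2)
    R = np.zeros((2, 2))                        # R(0) = 0
    Q = np.zeros(2)                             # Q(0) = 0
    for tk in np.arange(0.0, T, dt):
        phi = np.array([np.sin(tk), np.sin(0.5 * tk)])
        z = theta_star @ phi
        m2 = 1.0 + phi @ phi
        R += dt * (-beta * R + np.outer(phi, phi) / m2)    # dR/dt = -beta*R + phi*phi^T/m^2
        Q += dt * (-beta * Q - z * phi / m2)               # dQ/dt = -beta*Q - z*phi/m^2
        theta_hat += dt * (-Gamma @ (R @ theta_hat + Q))   # dtheta_hat/dt = -Gamma*(R*theta_hat + Q)
    print("theta_hat:", theta_hat)              # approaches theta_star when R becomes positive definite (PE)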