
Sensor Fusion Based Missile Guidance*

Pubudu N Pathirana
School of Engineering and Technology,
Deakin University,
Geelong, Victoria 3217, Australia
[email protected]

Andrey V Savkin
School of Electrical Engineering and Telecommunications,
University of New South Wales,
Sydney 2052, Australia
[email protected]

* This work is supported by the Australian Research Council.

Abstract – In this paper, we improve guidance system performance via sensor fusion techniques. Vision based guidance systems can be improved in performance via radar tracking or by employing video tracking from unmanned flying vehicles. We also introduce an image texture gradient based image segmentation technique to identify the target in a typical surface-to-air application, together with the proposed Robust Extended Kalman Filter based state estimation technique, for the implementation of the Proportional Navigation guidance controller.

Keywords: Sensor fusion, Robust Extended Kalman Filter, Missile Guidance.

1 Introduction

The need for improvement in missile guidance systems was brought about by recent developments in weapon system and sub-system technologies, as well as a shift in guided weapon system deployment and operational philosophies.

In the past, due to real-time computing constraints, major simplifications of the engagement kinematics model, performance index and constraints had to be implemented in order to render the solution suitable for mechanization in a real system. With recent technological advances, particularly in computing, those constraints no longer apply. It is now feasible to look at guidance strategies that are not only more efficient in achieving small miss distances in order to maximize lethality, but also economical and simpler in production. Onboard video cameras can be used for missile guidance in place of onboard radar systems, especially in the terminal phase of a surface to air missile. Passive video imaging is less expensive, consumes less energy, and hence simplifies the onboard electronics considerably.

In missile systems, not all the states are available for measurement. Images from a video signal can be used to estimate the target location in the image plane, and this information can in turn be used to estimate the state variables of the guidance system. A moving target captured by an onboard camera mounted on a SAM typically consists of a target moving against a background that moves uniformly in a specific direction. Considering the perspective projection of objects onto the image plane via simple geometrical optics concepts, the resulting nonlinear state estimation problem can be addressed by the Robust Extended Kalman Filter (REKF), which is based on new theoretical developments presented in [1, 2]. The system's performance can be improved by incorporating additional radar measurements or by employing additional, cheaper unmanned flying vehicles (UFV) or multiple missiles [3]. In this application we consider Proportional Navigation (PN) guidance as the controller, for its wide usage, although a robust H∞ type controller taking account of uncertain target maneuvers can also be relevant [4].

First, we introduce the missile/target kinematics in the 3D case. Two examples have been chosen for simulation to demonstrate the advantages of information fusion and multiple missiles in the missile guidance problem. Firstly, we compare the performance of a video guided missile with that of such a missile assisted by a ground based radar station. Secondly, we consider the improvement of a video guided missile using video captured by additional unmanned flying vehicles. Here, we consider the terminal phase of the surface to air missile and assume non-rotational flight of the missile for the sake of simplicity, which is a fair assumption for most cases in the considered flight phase. The same ideas can be extended to flight with the rotational kinematics taken into consideration.

2 Missile Target Kinematic Model

Let the position vectors of the missile and the target with respect to the earth frame be $(x^M_1, x^M_2, x^M_3)$ and $(x^T_1, x^T_2, x^T_3)$ respectively. With the target/missile state defined as

$$x^{T/M}_E = [x^{T/M}_1 \;\; x^{T/M}_2 \;\; x^{T/M}_3 \;\; \dot{x}^{T/M}_1 \;\; \dot{x}^{T/M}_2 \;\; \dot{x}^{T/M}_3]',$$


[Figure 2.1: Missile navigation with ground radar station]

the target missile relative state $x$ can be stated as:

$$x = x^{TM} = x^T_E - x^M_E \qquad (2.1)$$

for the system

$$\dot{x} = Ax + B_1 u + B_2 w \qquad (2.2)$$

with system matrices

$$A = \begin{bmatrix} 0&0&0&1&0&0 \\ 0&0&0&0&1&0 \\ 0&0&0&0&0&1 \\ 0&0&0&0&0&0 \\ 0&0&0&0&0&0 \\ 0&0&0&0&0&0 \end{bmatrix}, \quad
B_1 = \begin{bmatrix} 0&0&0 \\ 0&0&0 \\ 0&0&0 \\ -1&0&0 \\ 0&-1&0 \\ 0&0&-1 \end{bmatrix}, \quad
B_2 = \begin{bmatrix} 0&0&0 \\ 0&0&0 \\ 0&0&0 \\ 1&0&0 \\ 0&1&0 \\ 0&0&1 \end{bmatrix}, \qquad (2.3)$$

with $u = [a_x \;\; a_y \;\; a_z]'$ and $w$ being the control input vector and the target maneuver respectively. The system given in equation 2.2 represents the missile target kinematic model for both cases that we consider in this paper.
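As a concreteness check, the kinematic model of equations 2.2 and 2.3 is straightforward to set up in code. The following sketch (Python/NumPy, with an assumed Euler step `dt` purely for illustration; the paper itself does not specify a discretization) builds $A$, $B_1$ and $B_2$ and propagates the relative state one step:

```python
import numpy as np

# System matrices of equation 2.3: a 3D double integrator in the
# relative state x = [positions; velocities].
A = np.zeros((6, 6))
A[0:3, 3:6] = np.eye(3)       # position derivatives are the velocities

B1 = np.zeros((6, 3))
B1[3:6, :] = -np.eye(3)       # missile control u enters with a minus sign,
                              # since x = x_T - x_M

B2 = np.zeros((6, 3))
B2[3:6, :] = np.eye(3)        # target maneuver w drives the relative acceleration

def step(x, u, w, dt=0.01):
    """One Euler step of x_dot = A x + B1 u + B2 w (assumed discretization)."""
    return x + dt * (A @ x + B1 @ u + B2 @ w)
```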

3 Measurement Equation

Perspective projection describes image formation using an ideal pinhole camera according to the principles of geometrical optics [5]. The following algebraic relations describing the perspective transformation can be given.

Let the point-wise target location in the image plane be $(\bar{x}_1, \bar{x}_2)$, with $f$ the focal length of the camera.

[Figure 2.2: Missile navigation with unmanned aerial vehicles (UAV)]

[Figure 3.1: Projection of a point-wise target onto the image plane]

Assuming a non-rotating frame attached to the missile, the measurements $[\bar{x}_1 \;\; \bar{x}_2]$ are:

$$\bar{x}_1 = \frac{f x_1}{f - x_3}, \qquad \bar{x}_2 = \frac{f x_2}{f - x_3} \qquad (3.1)$$
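As a minimal sketch of equation 3.1 (Python; the helper name `project` and the default focal length are illustrative assumptions):

```python
def project(x, f=1.0):
    """Pinhole projection of eq. 3.1: relative position -> image plane.

    x : relative state vector; only the positions x[0], x[1], x[2] are used.
    Returns the image-plane coordinates.
    """
    denom = f - x[2]          # assumes the target is not at the focal plane
    return f * x[0] / denom, f * x[1] / denom
```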

With primarily a video based sensing system, here we consider two instances. In case I, the video based missile performance is improved by employing a ground radar station (figure 2.1). In case II, the video based missile performance is enhanced by employing two unmanned flying vehicles for video capturing (figure 2.2), giving enhanced spatial diversity. The measurement equation for each case is of the form:

$$y = \beta(x(t)) + v \qquad (3.2)$$


where

$$\beta(x) = \begin{bmatrix} \dfrac{f x_1(t)}{f - x_3(t)} \\[2mm] \dfrac{f x_2(t)}{f - x_3(t)} \end{bmatrix} \quad \text{(single missile)},$$

$$\beta(x) = \begin{bmatrix} \dfrac{f x_1(t)}{f - x_3(t)} \\[2mm] \dfrac{f x_2(t)}{f - x_3(t)} \\ x_1(t) \\ x_2(t) \\ x_3(t) \end{bmatrix} \quad \text{(with video and radar)}, \qquad (3.3)$$

$$\beta(x) = \begin{bmatrix} \dfrac{f x_1(t)}{f - x_3(t)} \\[2mm] \dfrac{f x_2(t)}{f - x_3(t)} \\[2mm] \dfrac{f\left(x_1(t) + x^M_1(t) - x^1_1(t)\right)}{f - \left(x_3(t) + x^M_3(t) - x^1_3(t)\right)} \\[2mm] \dfrac{f\left(x_2(t) + x^M_2(t) - x^1_2(t)\right)}{f - \left(x_3(t) + x^M_3(t) - x^1_3(t)\right)} \\[2mm] \dfrac{f\left(x_1(t) + x^M_1(t) - x^2_1(t)\right)}{f - \left(x_3(t) + x^M_3(t) - x^2_3(t)\right)} \\[2mm] \dfrac{f\left(x_2(t) + x^M_2(t) - x^2_2(t)\right)}{f - \left(x_3(t) + x^M_3(t) - x^2_3(t)\right)} \end{bmatrix} \quad \text{(missile with UFV)},$$

where $x$ is the state of the missile target dynamic system, $x^M$ is the missile state and $x^i$ is the state of the $i$th UAV.

Here, we assume the rotational kinematics of the missile are negligible for the sake of simplicity; the same ideas can be extended to the case where the rotational dynamics are accounted for.
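The three measurement maps of equation 3.3 can then be assembled by stacking the appropriate projections and radar readings. A sketch under the same assumptions, reusing the `project` helper above (the function names are illustrative; the UFV case re-expresses the target position relative to each vehicle via $x + x^M - x^i$):

```python
import numpy as np

def beta_video(x, f=1.0):
    """Single-missile case of eq. 3.3: the two image coordinates."""
    return np.array(project(x, f))

def beta_video_radar(x, f=1.0):
    """Video + ground radar case: image coordinates stacked with the
    radar's relative position measurement (x1, x2, x3)."""
    return np.concatenate([beta_video(x, f), x[0:3]])

def beta_ufv(x, x_m, x_ufvs, f=1.0):
    """Missile-with-UFV case: own camera plus one projection per UFV.

    x_m    : missile state (positions used),
    x_ufvs : list of UFV states (assumed known to the filter).
    """
    parts = [beta_video(x, f)]
    for x_i in x_ufvs:
        rel = np.array(x, dtype=float)
        rel[0:3] = x[0:3] + x_m[0:3] - x_i[0:3]   # target seen from UFV i
        parts.append(beta_video(rel, f))
    return np.concatenate(parts)
```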

3.1 Target Localization

Optical flow is the apparent motion of the brightness/intensity patterns observed when the camera is moving relative to the objects being imaged [5, 6]. Let $I(x_1, x_2, t)$ denote the image intensity function at time $t$ at the image point $(x_1, x_2)$. Assuming that the overall intensity of the image is time independent, the well known optical flow equation can be written as

$$\frac{\partial I}{\partial x_1}\frac{dx_1}{dt} + \frac{\partial I}{\partial x_2}\frac{dx_2}{dt} + \frac{\partial I}{\partial t} = 0. \qquad (3.4)$$

With the optical flow $v = (v_1, v_2)' = (dx_1/dt,\, dx_2/dt)'$, equation 3.4 can be written as

$$v_2 = m v_1 + c \qquad (3.5)$$

where $m = -\dfrac{\partial I/\partial x_1}{\partial I/\partial x_2}$ and $c = -\dfrac{1}{\partial I/\partial x_2}\dfrac{\partial I}{\partial t}$. The texture gradient $m$ can be used to segment the video stream and localize the target in the image plane. In a practical system, where computational efficiency is crucial, we propose using this texture gradient rather than calculating the computationally taxing full optical flow (magnitude and direction) [7, 8, 9]. Local and global techniques for calculating the optical flow for this ill-posed problem are presented in [9] and [8] respectively. The result is low pass filtered to isolate the target.
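A minimal sketch of this segmentation step (Python with NumPy/SciPy; the moving-average smoother is an assumed stand-in for the 0.2 normalized-frequency low pass filter used in section 6, and the axis convention and `eps` guard are our assumptions):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def texture_gradient_localize(frame, eps=1e-6, smooth=15):
    """Localize the target in one grayscale frame via the texture
    gradient m of eq. 3.5, then low-pass filter and pick the peak."""
    dI_dx1, dI_dx2 = np.gradient(frame.astype(float))
    m = -dI_dx1 / (dI_dx2 + eps)          # m = -(dI/dx1)/(dI/dx2)
    m_filt = uniform_filter(np.abs(m), size=smooth)
    return np.unravel_index(np.argmax(m_filt), m_filt.shape)
```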

4 Proportional Navigation

Since the first successful test of the Lark missile in December 1950, Proportional Navigation (PN) has come to be widely employed by homing missiles. Especially in the case of intercepting airborne targets with SAMs, PN guidance (PNG) has proved to be a useful scheme [10]. Under this scheme, the missile is commanded to turn at a rate proportional to the angular velocity of the line of sight (LOS). The missile normal acceleration is given by:

$$u_N = N V_c \dot{\lambda} \qquad (4.1)$$

where $N$ is the navigation constant, $\lambda$ is the LOS angle and $V_c$ is the closing velocity. PNG works ideally for non-maneuvering, constant speed targets. Extensions to PNG have been proposed for when the target acceleration is significant: augmented PNG (APNG) [11], and biased PNG (BPNG) [12], which can be used to hit with a desired impact angle. Detailed descriptions of Pure PNG (PPN) and True PNG (TPN) can be found in [13, 14] and [13, 14, 10]. In our implementation, we use APNG with an effective navigation ratio of 3, presented in [15] in the form

$$u = \frac{3}{t_g^2}\,[I_3 \;\; t_g I_3 \;\; 0]\, \hat{x} \qquad (4.2)$$

where $t_g$ is the estimated time to go, and $\hat{x}$ is the estimated state.
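A minimal sketch of the APNG law of equation 4.2, adapted to the 6-dimensional relative state of section 2 (so the trailing zero block of equation 4.2 is dropped); the range-over-closing-speed time-to-go estimate is our assumption, not taken from the paper:

```python
import numpy as np

def apng_control(x_hat, t_go):
    """APNG of eq. 4.2 with effective navigation ratio 3.

    x_hat : estimated relative state [positions; velocities],
    t_go  : estimated time to go.
    """
    G = np.hstack([np.eye(3), t_go * np.eye(3)])   # [I3  t_g*I3]
    return (3.0 / t_go**2) * (G @ x_hat)

def time_to_go(x_hat):
    """Assumed estimator: |r|^2 / (-r.v), i.e. range over closing speed."""
    r, v = x_hat[0:3], x_hat[3:6]
    return np.dot(r, r) / max(-np.dot(r, v), 1e-6)
```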

5 Set-value state estimation with a non-linear signal model

We consider a nonlinear uncertain system of the form

$$\begin{aligned}
\dot{x} &= A(x, u) + B_2 w, \\
z &= K(x, u), \\
y &= C(x) + v,
\end{aligned} \qquad (5.1)$$

as a general form of the system given by equation 2.2, defined on the finite time interval $[0, s]$. Here, $x(t) \in \mathbf{R}^n$ denotes the state of the system, $y(t) \in \mathbf{R}^l$ is the measured output and $z(t) \in \mathbf{R}^q$ is the uncertainty output. The uncertainty inputs are $w(t) \in \mathbf{R}^p$ and $v(t) \in \mathbf{R}^l$. Also, $u(t) \in \mathbf{R}^m$ is the known control input. We assume that all of the functions appearing in (5.1) have continuous and bounded partial derivatives. Additionally, we assume that $K(x, u)$ is bounded. These assumptions simplify the mathematical derivations and can be relaxed in practice [2, 16]. The matrix $B_2$ is assumed to be independent of $x$ and of full rank.


[Figure 5.1: Target Localization. A: launch phase image; B: mid-course image; C, D: texture gradient m for (A) and (B); E, F: filtered texture gradient m (directional optical flow angles in degrees over the image X and Y axes) for (A) and (B); G, H: target localized in (A) and (B). The video streams are courtesy of the Missile Defence Agency's (MDA) Program Video Library (LEAP test flight), http://www.acq.osd.mil/bmdo/bmdolink/html/video.html]


The uncertainty in the system is defined by the following nonlinear integral constraint [1, 2, 17, 18, 19]:

$$\Phi(x(0)) + \int_0^s L_1\left(w(t), v(t)\right) dt \;\leq\; d + \int_0^s L_2\left(z(t)\right) dt, \qquad (5.2)$$

where $d \geq 0$ is a nonnegative real number. Here, $\Phi$, $L_1$ and $L_2$ are bounded non-negative functions with continuous partial derivatives satisfying growth conditions of the type

$$\|\phi(x) - \phi(x')\| \;\leq\; \beta\left(1 + \|x\| + \|x'\|\right)\|x - x'\|, \qquad (5.3)$$

where $\|\cdot\|$ is the Euclidean norm, $\beta > 0$, and $\phi = \Phi, L_1, L_2$. Uncertainty inputs $w(\cdot), v(\cdot)$ satisfying this condition are called admissible uncertainties. We consider the problem of characterizing the set of all possible states $X_s$ of the system (5.1) at time $s \geq 0$ which are consistent with a given control input $u_0(\cdot)$ and a given output path $y_0(\cdot)$; i.e., $x \in X_s$ if and only if there exist admissible uncertainties such that, if $u_0(t)$ is the control input and $x(\cdot)$ and $y(\cdot)$ are the resulting trajectories, then $x(s) = x$ and $y(t) = y_0(t)$ for all $0 \leq t \leq s$.

5.1 The State Estimator

The state estimation set $X_s$ is characterized in terms of level sets of the solution $V(x, s)$ of the PDE

$$\frac{\partial V}{\partial t} + \max_{w \in \mathbf{R}^p}\left\{\nabla_x V \cdot \left(A(x, u_0) + B_2 w\right) - L_1\left(w,\, y_0 - C(x)\right) + L_2\left(K(x, u_0)\right)\right\} = 0, \qquad V(\cdot, 0) = \Phi. \qquad (5.4)$$

The PDE (5.4) can be viewed as a filter, taking the observations $u_0(t), y_0(t)$, $0 \leq t \leq s$ and producing the set $X_s$ as an output. The state of this filter is the function $V(\cdot, s)$; thus $V$ is an information state for the state estimation problem.

Theorem 5.1 Assume the uncertain system (5.1), (5.2) satisfies the assumptions given above. Then the corresponding set of possible states is given by

$$X_s = \{x \in \mathbf{R}^n : V(x, s) \leq d\}, \qquad (5.5)$$

where $V(x, t)$ is the unique viscosity solution of (5.4) in $C(\mathbf{R}^n \times [0, s])$.

Proof: see [2].

5.2 A Robust Extended Kalman Filter

Here we consider an approximation to the PDE (5.4) which leads to a Kalman filter like characterization of the set $X_s$. Petersen and Savkin in [2] presented this as an Extended Kalman Filter version of the solution to the set-value state estimation problem for a linear plant with the uncertainty described by an Integral Quadratic Constraint (IQC). This IQC is also a special case of equation 5.2. We consider the uncertain system described by (5.1) and an integral quadratic constraint of the form

$$\left(x(0) - x_0\right)' N \left(x(0) - x_0\right) + \frac{1}{2}\int_0^s \left( w(t)' Q(t) w(t) + v(t)' R(t) v(t) \right) dt \;\leq\; d + \frac{1}{2}\int_0^s z(t)' z(t)\, dt, \qquad (5.6)$$

where $N > 0$, $Q > 0$ and $R > 0$. For the system (5.1), (5.6), the PDE (5.4) can be written as

$$\frac{\partial V}{\partial t} + \nabla_x V \cdot A(x, u_0) + \frac{1}{2} \nabla_x V B_2 Q^{-1} B_2' \nabla_x V' - \frac{1}{2}\left(y_0 - C(x)\right)' R \left(y_0 - C(x)\right) + \frac{1}{2} K(x, u_0)' K(x, u_0) = 0, \qquad V(x, 0) = (x - x_0)' N (x - x_0). \qquad (5.7)$$

Considering a function $\hat{x}(t)$ defined as $\hat{x}(t) \triangleq \arg\min_x V(x, t)$, the following equations (5.8), (5.9) and (5.10) define our approximate solution to the PDE (5.7):

$$\dot{\hat{x}}(t) = A\left(\hat{x}(t), u_0\right) + X^{-1}\left[\nabla_x C\left(\hat{x}(t)\right)' R \left(y_0 - C\left(\hat{x}(t)\right)\right) + \nabla_x K\left(\hat{x}(t), u_0\right)' K\left(\hat{x}(t), u_0\right)\right], \qquad \hat{x}(0) = x_0, \qquad (5.8)$$

where $X(t)$ is defined as the solution to the Riccati Differential Equation (RDE)

$$\dot{X} + \nabla_x A\left(\hat{x}, u_0\right)' X + X \nabla_x A\left(\hat{x}, u_0\right) + X B_2 Q^{-1} B_2' X - \nabla_x C(\hat{x})' R \nabla_x C(\hat{x}) + \nabla_x K\left(\hat{x}, u_0\right)' \nabla_x K\left(\hat{x}, u_0\right) = 0, \qquad X(0) = N, \qquad (5.9)$$

and

$$\phi(t) \triangleq \frac{1}{2}\int_0^t \left[\left(y_0 - C(\hat{x})\right)' R \left(y_0 - C(\hat{x})\right) - K\left(\hat{x}, u_0\right)' K\left(\hat{x}, u_0\right)\right] d\tau. \qquad (5.10)$$

The function $V(x, t)$ is approximated by a function of the form

$$V(x, t) = \frac{1}{2}\left(x - \hat{x}(t)\right)' X(t) \left(x - \hat{x}(t)\right) + \phi(t).$$

Hence, it follows from Theorem 5.1 that an approximate formula for the set $X_s$ is given by

$$X_s = \left\{x \in \mathbf{R}^n : \frac{1}{2}\left(x - \hat{x}(s)\right)' X(s) \left(x - \hat{x}(s)\right) \leq d - \phi(s)\right\}.$$

This amounts to the so-called Robust Extended Kalman Filter generalization presented in [2].
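For implementation, the coupled pair (5.8)-(5.9) can simply be integrated forward in time. A minimal sketch (Python; Euler integration, numerical Jacobians in place of $\nabla_x A$ and $\nabla_x C$, and the uncertainty output $K$ taken as zero, all simplifying assumptions on our part):

```python
import numpy as np

def num_jacobian(fun, x, eps=1e-6):
    """Forward-difference Jacobian of fun at x (illustrative helper)."""
    fx = np.atleast_1d(fun(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (np.atleast_1d(fun(x + dx)) - fx) / eps
    return J

def rekf_step(x_hat, X, y, f_dyn, h_meas, B2, Q_inv, R, dt):
    """One Euler step of eqs. (5.8)-(5.9) with K = 0.

    f_dyn  : x -> A(x, u0), the controlled drift,
    h_meas : x -> C(x), the measurement map.
    """
    A1 = num_jacobian(f_dyn, x_hat)               # grad_x A at the estimate
    C1 = num_jacobian(h_meas, x_hat)              # grad_x C at the estimate
    innov = y - h_meas(x_hat)
    x_dot = f_dyn(x_hat) + np.linalg.solve(X, C1.T @ R @ innov)   # eq. 5.8
    X_dot = -(A1.T @ X + X @ A1 + X @ B2 @ Q_inv @ B2.T @ X
              - C1.T @ R @ C1)                                    # eq. 5.9
    return x_hat + dt * x_dot, X + dt * X_dot
```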


| Parameter | Value | Comments |
|---|---|---|
| N | 0.1 × I6 | Weighting on the initial viscosity solution |
| Q | 10 × I3 | Weighting on the uncertainty in the driving command |
| R | 0.1 × I2, I5 | Weighting on the uncertainty in the measurements |
| Am | 0.1 | Amplitude of the target maneuver |
| T | 12 s | Flight time |
| x0 | [1000 3000 3000 0 0 2000] m | Initial state |

Table 6.1: Simulation parameters for case I.

| Parameter | Value | Comments |
|---|---|---|
| N | 0.1 × I6 | Weighting on the initial viscosity solution |
| Q | I3 | Weighting on the uncertainty in the missile driving command |
| R | 10^7 × I2, I6 | Weighting on the uncertainty in the measurements |
| Am | 0.1 | Amplitude of the target maneuver |
| T | 12 s | Flight time |
| x0 | [1000 3000 3000 0 0 2000] m | Initial state |

Table 6.2: Simulation parameters for case II.

6 Simulations

We have carried out simulations for target localization in the video stream. Initially, the texture gradient ($m$ in equation 3.5) is calculated for every pair of consecutive images (see A, B in figure 5.1) and then low pass filtered using a filter with a pass band below 0.2 of the normalized frequency. Panels C, D of figure 5.1 show the directional texture gradient vectors. The location of the maximum of the filtered image corresponds to the target, as shown in panels G, H of figure 5.1. The noise in the images is assumed only to be a bounded function of time and space, and hence the estimated target locations are subject to bounded disturbances in time. This robustness assumption is in line with the Robust Extended Kalman Filter assumptions stated in section 5.

To demonstrate the advantage of using information fusion of video and radar sensors to improve the system performance, we simulate a missile with an onboard camera assisted by a radar ground station. The state estimation equation and the corresponding Riccati Differential Equation, obtained from equations 5.8 and 5.9, are as follows:

$$\dot{\hat{x}}(t) = A\hat{x}(t) + B_1 u(t) + X^{-1}(t)\left[\beta_1\left(\hat{x}(t)\right)' R \left(y(t) - \beta\left(\hat{x}(t)\right)\right)\right], \qquad \hat{x}(0) = x_0, \quad u(0) = 0, \qquad (6.1)$$

$$\dot{X} + A' X + X A + X B_2 Q^{-1} B_2' X - \beta_1\left(\hat{x}(t)\right)' R\, \beta_1\left(\hat{x}(t)\right) = 0, \qquad (6.2)$$

$$X(0) = N, \qquad (6.3)$$

where $\beta(x)$ is given in equation 3.3, with

$$\beta_1(x) = \nabla_x \beta(x). \qquad (6.4)$$

The simulation parameters used are given in table 6.1, and the significant improvement is shown in figure 7.1. For the second system we simulate the same maneuvering target; the parameters used are given in table 6.2. The trajectory of each aerial body for a typical case of a maneuvering target ($\omega = 0.5$ rad/sec) is shown in figure 7.2. The improvement in the system performance is shown in figure 7.3.

For both cases, the target maneuver we employed is given by

$$w(t) = A_m \left[\, -t \sin(\omega t) \;\; 20 \sin(\omega t) \;\; 0 \,\right]' \qquad (6.5)$$

for a range of maneuver frequencies $\omega$. Tables 6.1 and 6.2 give the parameter values used in the simulations.
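The maneuver of equation 6.5 is reproduced directly in simulation; a minimal sketch with $A_m$ defaulting to the value in tables 6.1 and 6.2:

```python
import numpy as np

def target_maneuver(t, omega, A_m=0.1):
    """Target maneuver w(t) of eq. 6.5."""
    return A_m * np.array([-t * np.sin(omega * t),
                           20.0 * np.sin(omega * t),
                           0.0])
```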

7 Conclusion

In this paper we considered the improvement of the performance of a surface to air missile using information fusion ideas. For economic and availability reasons, video based guidance is becoming a key area of interest in the defence industry, and we investigated the issue of improving the performance of such systems by employing sensor fusion ideas. The two cases we considered (video imaging incorporating radar, and video received from multiple vehicles) enhanced the video based guidance system's performance significantly.

References

[1] A.V. Savkin and I.R. Petersen. Recursive state estimation for uncertain systems with an integral quadratic constraint. IEEE Transactions on Automatic Control, 40(6):1080–1083, 1995.

[2] I.R. Petersen and A.V. Savkin. Robust Kalman Filtering for Signals and Systems with Large Uncertainties. Birkhauser, Boston, 1999.


[Figure 7.1: Miss distance vs. maneuver frequency (rad/sec) for a missile guided with video imagery alone and with video imagery supported by ground radar]

[Figure 7.2: Trajectory of the missile, target and the UAVs for ω = 0.5 rad/sec]

[Figure 7.3: Miss distance vs. target maneuver frequency (rad/sec) for a missile guided by a single video camera and by video from multiple imaging unmanned aerial vehicles (UAVs)]

[3] E.J. Hughes. Evolutionary guidance for multiple missiles. In Proceedings of the 15th Triennial World Congress of the International Federation of Automatic Control (IFAC), Barcelona, Spain, 2002.

[4] A.V. Savkin, P.N. Pathirana, and F.A. Faruqi. The problem of precision missile guidance: LQR and H∞ control frameworks. In Proceedings of the 40th IEEE Conference on Decision and Control, volume 2, pages 1535–1540, Florida, USA, December 2001.

[5] A.M. Tekalp. Digital Video Processing. Prentice Hall PTR, Upper Saddle River, NJ, 1995.

[6] B.K.P. Horn. Robot Vision. MIT Press, Cambridge, Massachusetts, 1986.

[7] J. Weber and J. Malik. Robust computation of optical flow in a multi-scale differential framework. International Journal of Computer Vision, 14:67–81, 1995.

[8] B.K.P. Horn and B.G. Schunck. Determining optical flow. Artificial Intelligence, 17:185–203, 1981.

[9] B. Lucas and T. Kanade. An iterative image registration technique with an application to stereo vision. In Proceedings of the DARPA Image Understanding Workshop, pages 121–130, 1981.

[10] Ciann-Dong Yang and Chi-Ching Yang. Analytical solution to true proportional navigation. IEEE Transactions on Aerospace and Electronic Systems, 32(4):1509–1522, October 1996.

[11] Y. Kim and J.H. Seo. The realization of the three dimensional guidance law using modified augmented proportional navigation. In Proceedings of the IEEE Conference on Decision and Control, pages 2707–2712, Kobe, Japan, December 1996.

[12] B.S. Kim, J.G. Lee, and H.S. Han. Biased PNG for impact with angular constraint. IEEE Transactions on Aerospace and Electronic Systems, 34(1):277–288, January 1998.

[13] E. Duflos, P. Penel, and P. Vanheeghe. 3D guidance law modeling. IEEE Transactions on Aerospace and Electronic Systems, 35(1):72–83, January 1999.

[14] Ciann-Dong Yang and Chi-Ching Yang. A unified approach to proportional navigation. IEEE Transactions on Aerospace and Electronic Systems, 33(2, Part I):557–567, April 1997.

[15] C.F. Lin. Modern Navigation, Guidance and Control Processing, Vol. II. Prentice Hall, Englewood Cliffs, NJ, 1991.

[16] M.R. James and I.R. Petersen. Nonlinear state estimation for uncertain systems with an integral constraint. IEEE Transactions on Signal Processing, 46(11):2926–2937, November 1998.

[17] A.V. Savkin and I.R. Petersen. A connection between H∞ control and the absolute stabilizability of uncertain systems. Systems and Control Letters, 23(3):197–203, 1994.

[18] A.V. Savkin and I.R. Petersen. Nonlinear versus linear control in the absolute stabilizability of uncertain linear systems with an integral quadratic constraint. IEEE Transactions on Automatic Control, 40(1):122–127, 1995.

[19] A.V. Savkin and R.J. Evans. Hybrid Dynamical Systems: Controller and Sensor Switching Problems. Birkhauser, Boston, 2002.
