
Real-Time Implementation of Airborne Inertial-SLAM

Jonghyuk Kim (a) and Salah Sukkarieh (b)

(a) College of Engineering and Computer Science, The Australian National University, ACT 0200, Australia
(b) ARC Center of Excellence for Autonomous Systems, The University of Sydney, NSW 2006, Australia

Abstract

This paper addresses the challenges in the real-time implementation of Simultaneous Localisation and Mapping (SLAM) on a UAV platform. Compared to the implementation of SLAM in 2D environments, airborne implementation poses several difficulties: computational complexity, unreliable loop closure, and high nonlinearity in both the vehicle dynamics and the observations. Airborne SLAM is formulated to relieve the computational complexity in both direct and indirect ways. The implementation is based on an Extended Kalman Filter which fuses data from an Inertial Measurement Unit (IMU) and a passive vision system. Real-time results from flight trials are provided.

Key words: Airborne SLAM, Inertial Measurement Unit (IMU), Vision, UAV

1 Introduction

Unmanned Aerial Vehicles (UAVs) have attracted much attention from robotics researchers in both the civilian and defence industries over the past few years. They can perform various tasks in highly dangerous environments where access by human operators or ground vehicles is limited. There is a broad spectrum of applications ranging from academic research and resource monitoring to search and rescue, bush fire monitoring, and information gathering. Advances in cost-effective inertial sensors and accurate navigation aids such as the Global Navigation Satellite System (GNSS) were key determinants of the feasibility of UAV systems. By fusing information from inertial sensors and GNSS, the 6DoF vehicle state can be reconstructed, which is a crucial step for guidance and flight control [1][2].

Preprint submitted to Elsevier Science 11 July 2006

Page 2: Real-Time Implementation of Airborne Inertial-SLAMusers.cecs.anu.edu.au/~Jonghyuk.Kim/pdf/RAS-AirborneSLAM...to provide the position, velocity and attitude of a platform, resulting

In many robotic applications, however, the vehicle needs to perform a task within environments where GNSS information is not available, such as indoors, in a forest, underground, or in general wherever GNSS is denied. In such cases autonomous localisation is required.

Autonomous localisation is the process of determining the platform's pose without the use of any a priori information external to the platform except for what the platform senses about the environment; that is, the determination of the platform's position and attitude without the use of predefined maps or any purpose-built infrastructure. This is known as Simultaneous Localisation and Mapping (SLAM), introduced by [3], where the vehicle has the capability of online map building while simultaneously utilising the generated map to estimate the errors in the navigation solution.

There have been significant advances in SLAM research over recent years in terms of real-time deployment and implementation in land and underwater applications. Most efforts have focused on reducing the computational complexity of SLAM. For example, large-scale maps are partitioned into smaller, amenable maps in [4] and [5], and [6] introduced the hierarchical sub-map method. The sparse nature of the information filter has also been extensively investigated and implemented [7]. In parallel to these efforts, there have been attempts to develop SLAM for 3D environments: the use of rotating laser range finders in mining applications, for example [8], and the use of stereo vision systems for low-dynamic aerial vehicles [9]. However, in these applications the 3D implementation is limited to low-dynamic vehicles due to the extensive processing required for 3D mapping.

For airborne applications, to the best of our knowledge there have been only three attempts up to now: SLAM on a blimp-type (thus low-dynamic) platform using a stereo vision system [9]; inertial SLAM in a laboratory environment [10]; and SLAM on a fixed-wing UAV with inertial sensors and a single vision system by the authors [11].

In this paper, we provide a real-time implementation of airborne SLAM as an extension of the previous work. The challenge in the airborne deployment of SLAM lies in the complexity of the dead-reckoning process and its fast-drifting error. If we look at how the localisation system for an airborne vehicle has been formulated in the past, the core sensing device has been an Inertial Measurement Unit (IMU). This unit measures the acceleration and rotation rate of a platform at high update rates, which can then be transformed and processed to provide the position, velocity and attitude of the platform, resulting in an Inertial Navigation System (INS) [12][13]. The data presented by the INS is fed to the guidance and control system to achieve further autonomy. Inertial navigation is significant in that it measures only dynamic quantities and is thus independent of the platform kinematics.



[Figure 1: block diagrams of the two filter architectures; inputs are angular rate and acceleration from the inertial sensors, and range, bearing and elevation from the terrain sensor.]
Fig. 1. (a) The direct 6DoF SLAM structure, which estimates the vehicle position, velocity and attitude along with the feature locations, and (b) the indirect 6DoF SLAM structure, which uses the error-state model of the INS and map and estimates the errors in the vehicle states and map.

The navigational solution of the INS, however, drifts with time like most other dead-reckoning systems. The drift rate of the inertial position is typically a cubic function of time, which makes the development of inertial-based SLAM a challenge. Even small gyro errors accumulate in the angle estimates (roll and pitch), which in turn misrepresent the gravitational acceleration as true vehicle acceleration, resulting in quadratic velocity and cubic position errors. Therefore the INS requires reliable and effective aiding information to constrain the errors. In this paper, we provide a real-time result of airborne SLAM based on an Extended Kalman Filter which fuses information obtained from a vision system with the information from the INS.
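To make this error growth concrete, here is a simplified small-angle analysis (an illustration, not taken from the paper): a constant gyro bias $\epsilon$ tilts the attitude estimate, the tilt leaks gravity $g$ into the sensed horizontal acceleration, and two integrations turn this into velocity and position errors:

$$
\delta\psi(t) \approx \epsilon\, t, \qquad
\delta v(t) \approx \int_0^t g\, \delta\psi\, d\tau = \tfrac{1}{2}\, g\, \epsilon\, t^2, \qquad
\delta p(t) \approx \tfrac{1}{6}\, g\, \epsilon\, t^3.
$$

For example, a modest bias of 0.01 degrees/s (about $1.75 \times 10^{-4}$ rad/s) already yields a position error of roughly 60 m after one minute of unaided flight.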

Section 2 presents the airborne SLAM algorithm based on the direct and indirect approaches and discusses the benefits of both. Sections 3 and 4 provide details of the real-time SLAM implementation and flight test results. Section 5 provides conclusions with directions for future work.

2 Airborne SLAM Formulation

SLAM has been formulated directly for nonlinear dynamic and observation models by using an Extended Kalman Filter (EKF). In inertial navigation applications, however, SLAM can also be formulated indirectly for linearised error dynamic/observation models by using a linear Kalman Filter (KF). This indirect formulation has several benefits over the direct formulation. Figure 1 compares the two SLAM structures. In both cases, an IMU provides the acceleration and angular velocity of the vehicle, and the observation sensor provides the range, bearing and elevation of the features. In the direct form, the filter accepts the raw data from the IMU and passes it into a nonlinear 6DoF model, and the EKF proceeds through the process of predicting and updating the states of the vehicle and the feature locations.



In an indirect implementation, however, the inertial loop is separated from the filter; the inertial navigation equations transform the raw inertial data to position, velocity and attitude outside of the filter at sufficiently high rates. The state dynamic model in the KF is an error model of both the vehicle and the features. When an observation occurs, a predicted observation is generated based on the current location of the vehicle and the location of the feature as dictated by the map. The difference between the predicted and actual observation is passed to the KF as an observed error. The KF uses this to estimate the inertial and feature errors. The estimated errors from the filter are then fed back to the INS and the map for correction.
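The feedback step at the end of this cycle can be pictured with a short sketch. This is a minimal illustration using Eigen with assumed types (ErrorEstimate, NavState and feedBack are illustrative names; the authors' onboard software is not reproduced here):

```cpp
#include <Eigen/Dense>

// Error state estimated by the linear KF (Equation 3 below): position,
// velocity and attitude errors plus the error of one observed feature.
struct ErrorEstimate {
    Eigen::Vector3d dp, dv, dpsi, dm;
};

// Navigation solution maintained by the high-rate INS loop outside the
// filter (attitude shown as a small-angle vector for brevity).
struct NavState {
    Eigen::Vector3d p, v, psi;
};

// The observed error fed to the KF is dz = z_actual - z_predicted; after
// the KF update, its estimated errors are removed from the INS solution
// and the map entry, and the filter's error state is reset to zero.
void feedBack(NavState& ins, Eigen::Vector3d& m_i, const ErrorEstimate& e) {
    ins.p   -= e.dp;
    ins.v   -= e.dv;
    ins.psi -= e.dpsi;
    m_i     -= e.dm;
}
```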

Although the heart of the SLAM algorithm is exactly the same, the mainbenefits of this structure can be summarised as follows [14]:

• The system becomes more tractable for real-time processing. The main INS loop can provide continuous navigation data within fixed time intervals. The SLAM update cycle, whose computation time increases with the map size, will not disrupt the main INS loop, and time-propagation algorithms can be used to match information at the appropriate times.

• The SLAM prediction cycle can exploit the low-dynamic characteristics of the INS errors. As a result, the rate of the prediction cycle can be much lower than that of the direct filter. The more accurate the IMU, the less frequently the prediction cycle has to run.

• The observability of the system can be analytically investigated by using the time-variant linear SLAM model. The observability Grammian matrix, and hence the rank of the system, can thus be evaluated, as sketched after this list.
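For reference, the test referred to in the last item is the standard observability condition for a linear time-varying system; with $\Phi$ the state transition matrix of the error model and $H$ the observation matrix,

$$
\mathcal{O}(t_0, t_f) = \int_{t_0}^{t_f} \Phi^T(\tau, t_0)\, H^T(\tau)\, H(\tau)\, \Phi(\tau, t_0)\, d\tau,
$$

and the error state is observable over $[t_0, t_f]$ exactly when $\mathcal{O}$ has full rank.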

The most significant benefit of indirect SLAM is its real-time performance where this is required. At the time of publication, however, the direct method was the one implemented, as it worked sufficiently for our purposes and the indirect form had not yet been implemented because of flight trial constraints. Both SLAM models are presented in the following sections.

2.1 Direct Formulation

The direct SLAM filter consists of a 6DoF INS and map model with a nonlinear feature observation model. Assuming a local-tangent inertial frame, the SLAM equations in continuous time can be written as



$$
\begin{bmatrix} \dot{p}^n \\ \dot{v}^n \\ \dot{q}^n \\ \dot{m}^n_{1\cdots N} \end{bmatrix}
=
\begin{bmatrix} v^n \\ C^n_b f^b + g^n \\ \tfrac{1}{2}\, q^n \otimes \omega^b \\ 0_{1\cdots N} \end{bmatrix},
\tag{1}
$$

where $C^n_b$ is a directional cosine matrix (DCM) which transforms the specific force vector measured in the body frame to the navigation frame, $\otimes$ represents the quaternion product, $\omega^b$ is the quaternion form of the angular velocity measured by the gyros, and $N$ is the total number of features.

The onboard terrain sensor measures the range, bearing and elevation, and thus provides the relative position vector between the vehicle and a feature. The relationship between the observation in the sensor frame and the location of the feature in the navigation frame determines the derivation of the observation model. The $i$th feature position in the navigation frame, $m^n_i$, is a function of the vehicle position $p^n$, the sensor lever-arm offset from the body centre $r^b_{bs}$, and the relative position of the feature measured from the sensor location $r^s_{sm}$ in the sensor frame [11]:

$$
m^n_i = p^n + C^n_b\, r^b_{bs} + C^n_b C^b_s\, r^s_{sm}. \tag{2}
$$

Here $C^b_s$ is a directional cosine matrix which transforms a vector in the sensor frame to the body frame and is defined for each payload sensor installment. A vision camera installed pointing downward is used; hence its frame is defined by rotating the body axes by -90 degrees about the pitch axis. The relative position vector of the feature from the sensor, $r^s_{sm} = [x\; y\; z]^T$, is computed from the range, bearing and elevation using a polar-to-Cartesian transformation. The observation model is then obtained by rearranging Equation 2 and representing the range, bearing and elevation with respect to the vehicle state and the map.

2.2 Indirect Formulation

As mentioned previously, the indirect SLAM filter estimates the errors inthe INS and the map. The error model can be developed by perturbing thenonlinear equation and can be written as:



$$
\begin{bmatrix} \delta\dot{p}^n \\ \delta\dot{v}^n \\ \delta\dot{\psi}^n \\ \delta\dot{m}^n_{1\cdots N} \end{bmatrix}
=
\begin{bmatrix} \delta v^n \\ f^n \times \delta\psi^n + C^n_b\, \delta f^b \\ -C^n_b\, \delta\omega^b \\ 0_{1\cdots N} \end{bmatrix},
\tag{3}
$$

with $\delta\psi^n$ being the misalignment angles, $\delta f^b$ the acceleration error in the body frame, and $\delta\omega^b$ the gyro error in the body frame.

To obtain a linearised observation model, the error in an initialised feature position needs to be analysed first. The feature error is defined as the difference between the predicted position $\hat{m}^n_i$ and the true position $m^n_i$:

$$
\delta m^n_i \triangleq \hat{m}^n_i - m^n_i. \tag{4}
$$

Perturbing Equation 2 with the errors and subtracting it from the true feature position gives

$$
\delta m^n_i = (\hat{p}^n - p^n) + (\hat{C}^n_b\, \hat{r}^b_{bs} - C^n_b\, r^b_{bs}) + (\hat{C}^n_b \hat{C}^b_s\, \hat{r}^s_{sm} - C^n_b C^b_s\, r^s_{sm}). \tag{5}
$$

Using the predicted DCM as the true DCM misaligned by errors,

$$
\hat{C}^n_b \triangleq (I - [\delta\psi^n \times])\, C^n_b, \qquad
\hat{C}^b_s \triangleq (I - [\delta\theta^b_s \times])\, C^b_s, \tag{6}
$$

where $[\delta\psi^n \times]$ and $[\delta\theta^b_s \times]$ are the INS attitude error and the sensor misalignment error expressed in skew-symmetric form, respectively. Using these, Equation 5 can now be rewritten as

$$
\delta m^n_i = \delta p^n + C^n_b\, \delta r^b_{bs} + C^n_b [r^b_{bs} \times]\, \delta\psi^n + C^n_b C^b_s\, \delta r^s_{sm} + C^n_b C^b_s [r^s_{sm} \times] (\delta\psi^n + \delta\theta^b_s). \tag{7}
$$

The sensor lever-arm error and misalignment error can be calibrated precisely; hence, ignoring them and using a transformed notation for the relative range vector, Equation 7 simplifies to

$$
\delta m^n_i = \delta p^n + [r^n_{sm} \times]\, \delta\psi^n + \delta r^n_{sm}. \tag{8}
$$

From this analysis it is clear that the position error of the vehicle is directly reflected in the initialised feature position error, which establishes the correlation structure between the vehicle and the features. The actual observation input to the filter is the last error term in Equation 8; hence rearranging this equation gives the observation model:

$$
\delta z^n_i = \delta m^n_i - \delta p^n - [r^n_{sm} \times]\, \delta\psi^n. \tag{9}
$$

2.3 Prediction and Update

With the state transition and observation models defined, the prediction and estimation procedure can progress. If the direct filter is used, the nonlinear state model is used to propagate the vehicle state, and the nonlinear observation model is used to estimate the vehicle and map state from the feature observations. The error covariance is propagated and updated by evaluating the Jacobian matrices of the nonlinear models. If the indirect filter is used, the filter predicts only the error covariance, since the predicted errors become zero under the assumption of zero-mean Gaussian noise. If feature observations are available, the error states are estimated in addition to the error covariance matrix. The estimated errors are then applied to the external INS loop and map database for correction.

Whenever a feature is observed, data association is conducted, which checks whether the feature has been previously observed. If the feature has been previously registered in the map, the observation is used to update the state and covariance; if the feature is a new one, a new feature error state is augmented to the filter state. Data association can be performed by using a simple nearest-neighbour method, or by using more robust methods such as joint compatibility branch and bound. In this work, the nearest-neighbour method is used for its simplicity and for real-time implementation; a minimal sketch follows. To avoid any possible ambiguity during flight, the ground features were installed with sufficient separation distances.
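The sketch below shows nearest-neighbour association with a Mahalanobis gate. The gate value (7.81, the 95% chi-square bound for 3 degrees of freedom) and the data layout are assumptions, not taken from the paper:

```cpp
#include <Eigen/Dense>
#include <cstddef>
#include <limits>
#include <vector>

// Returns the index of the matching map feature, or -1 for a new feature.
int nearestNeighbour(const Eigen::Vector3d& z,                  // observation
                     const std::vector<Eigen::Vector3d>& zHat,  // predictions
                     const std::vector<Eigen::Matrix3d>& S) {   // innov. cov.
    const double gate = 7.81;   // chi-square, 3 DoF, 95% (assumed value)
    double best = std::numeric_limits<double>::max();
    int bestIdx = -1;
    for (std::size_t i = 0; i < zHat.size(); ++i) {
        const Eigen::Vector3d nu = z - zHat[i];          // innovation
        const double d2 = nu.dot(S[i].ldlt().solve(nu)); // Mahalanobis^2
        if (d2 < gate && d2 < best) {
            best = d2;
            bestIdx = static_cast<int>(i);
        }
    }
    return bestIdx;  // -1 triggers augmentation with a new feature state
}
```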

Indirect inertial SLAM was implemented and compared to the direct form using a high-fidelity UAV simulator [14], showing improved performance in terms of computational complexity and flexibility in the filter configuration. In real-time operations, however, only the direct filter was implemented and verified due to the limited schedule of flight trials, and its results are presented in Section 4.

3 Real-time Implementation

The airborne SLAM algorithm is implemented on the UAV platform, the Brumby MkIII, which was designed and built at the University of Sydney.



Fig. 2. a) Two Brumby MkIII airframes on the runway at the flight test facility, b) the IMU from Inertial Science used to provide 6DoF vehicle information, and c) the small Sony camera used as the vision system.

3.1 Flight Vehicle and Sensor Systems

The Brumby airframes shown in Figure 2(a) are capable of flying at approximately 50 m/s and have an endurance on the order of 45 minutes with a 20 kg payload. Each UAV platform is equipped with various kinds of sensors, all connected to the Flight Control Computer (FCC) via serial links or the local bus system. An IMU from Inertial Science is used (shown in Figure 2(b)). It is small and lightweight with low power consumption, which is ideal for UAV applications, and is classified as a low-grade tactical IMU. It outputs acceleration and rotation rate at 400 Hz to the FCC and the SLAM computer through an RS-422 serial link during flight operations. An onboard vision system provides feature observations to the onboard SLAM node. The vision system makes use of a low-cost monochrome camera from Sony.



[Figure 3(b): thread diagram; an interrupt handler synchronised by the FCC time pulse feeds IMU packets to the INS/SLAM navigation thread, which exchanges observations and commands with the vision node via a CommlibX agent and passes position/velocity/attitude/map output to the data logger and display/downlink threads.]
Fig. 3. a) The SLAM processor is installed in the front nose pointing downward and b) the real-time SLAM multi-threaded structure.

As shown in Figure 2(c), the camera is a charge-coupled device with a resolution of 600 horizontal lines. It has a composite video output, which gives images at 25 Hz or 50 Hz depending on the interlaced mode. The vision sensor is mounted, pointing down, in the second payload bay of the Brumby MkIII, immediately behind the forward bulkhead. Artificial features were placed on the ground; white plastic sheets, which were easier for the vision system to identify, were used. A simple intensity-threshold algorithm is used to extract these features from the images in real time.



Table 1
EKF-SLAM filter Q/R tuning parameters (1σ).

Param  Type                        Unit         Value
Q      Pos noises (X/Y/Z)          m            0.5
       Accel noises (X/Y/Z)        m/s^2/√Hz    0.1
       Gyro noises (R/P/Y)         rad/s/√Hz    0.001
R      Range noise                 m            ≥20
       Bearing/Elevation noises    °            0.1604, 0.1206

The bearing to each feature can then be generated. Although the vision sensor does not give range directly, an estimated value is generated based on the size of the features. The SLAM processing computer is installed separately inside the front nose and is shown in Figure 3(a). The FCC and SLAM computer are connected via the local bus, and all the information required for SLAM operation is passed through this bus.
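A plausible form of this size-based range estimate is the pinhole relation (an assumption; the paper does not state its exact formula): with a known feature width $W$, an imaged width of $w$ pixels, and a focal length $f$ in pixels,

$$
\rho \approx \frac{f\, W}{w},
$$

so any error in the measured width $w$ maps directly into a range error, a point revisited in Section 4.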

3.2 Real-time Software Development

The SLAM algorithm is implemented using a C++ class framework which provides flexibility and modularity. By using Windows development tools, the algorithm can be easily debugged and verified, reducing the overall development time. The developed C++ class modules are then ported into multi-threaded real-time SLAM software under the QNX environment. The thread interconnections and the data flow within the SLAM computer are shown in Figure 3(b). As the direct SLAM algorithm is implemented, the filter thread processes the IMU data to predict the vehicle and map state, as well as to propagate the vehicle covariance and the cross-correlation between vehicle and map. Whenever an observation arrives, the filter thread performs the SLAM update. The display thread fetches the SLAM results and sends them to the ground station and the mission computer. The interrupt handler wakes up on the arrival of a synchronisation pulse from the FCC with the corresponding timing message from the network.
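The producer/consumer split between IMU acquisition and the filter thread can be suggested with a portable sketch. The real system used QNX primitives and additional threads (display, logging), so this skeleton with an assumed ImuPacket type is only illustrative:

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct ImuPacket { double t; double f[3]; double w[3]; };

static std::queue<ImuPacket> imuQueue;
static std::mutex mtx;
static std::condition_variable cv;
static bool running = true;

void filterThread() {
    for (;;) {
        std::unique_lock<std::mutex> lk(mtx);
        cv.wait(lk, [] { return !imuQueue.empty() || !running; });
        if (imuQueue.empty() && !running) return;   // drained and stopped
        ImuPacket pkt = imuQueue.front();
        imuQueue.pop();
        lk.unlock();
        // SLAM prediction with pkt would run here; a pending vision
        // observation would trigger the update before the next prediction.
        (void)pkt;
    }
}

int main() {
    std::thread filter(filterThread);
    {   // acquisition side: push one packet (400 Hz in the real system)
        std::lock_guard<std::mutex> lk(mtx);
        imuQueue.push(ImuPacket{0.0, {0.0, 0.0, -9.81}, {0.0, 0.0, 0.0}});
    }
    cv.notify_one();
    {   std::lock_guard<std::mutex> lk(mtx); running = false; }
    cv.notify_one();
    filter.join();
    return 0;
}
```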

Table 1 presents the filter Q/R tuning parameters used in real-time operation. The range uncertainty was actually estimated within the vision processing node based on the known feature size; the nominal range error, during level flight at 100 m of altitude above ground, was greater than 20 m. The filter initialisation was performed on the fly by utilising the information received from the GNSS/Inertial system.

In real-time operation, the IMU and vision data are processed with real-time or near-real-time methods, depending on the latency of the SLAM filter update. That is, SLAM prediction and update are performed in real time while the number of features is low, but if the latency of the filter update becomes significant, the accumulated inertial/vision data are processed as a block. Another concern for real-time SLAM was achieving reliable loop closure. Although the ground features were installed with enough separation (>50 m) to ease the online data association problem, it was observed that loop closure was still unreliable due to the large drift errors in the INS solution. Hence the data association software was modified to exploit the on-board navigation solution during loop closure.

Table 2
SLAM initial parameters and associated uncertainties (1σ) transferred from the on-board FCC system (UTM coordinates are used).

State           X           Y          Z        X(1σ)   Y(1σ)   Z(1σ)
Position (m)    6167423.13  230258.39  −769.60  2.0871  2.1176  3.7197
Velocity (m/s)  31.16       15.50      −3.93    1.2488  1.1590  1.3404
Attitude (°)    −11.51      12.15      6.52     1.2194  1.3126  1.4576

4 Real-time Results

Tests were performed at Marulan, the University of Sydney's flight test facility. A typical flight test time was around 30 minutes and the nominal flight height was 100 m above ground. The maximum ground speed was approximately 40 m/s. In a subsequent flight trial, SLAM was tested as a real-time navigation system. SLAM was activated during flight when the vision camera had a clear view of features on the ground and when a designated height above ground was reached. The SLAM initialisation required initial position, velocity and attitude information along with the uncertainties in these values. Table 2 shows these parameters, which were transferred from the onboard GNSS/Inertial navigation system before SLAM was activated. The IMU calibration data was also transferred via the vehicle bus.

The SLAM-estimated vehicle and map positions in real time are shown in Figure 4 with the GNSS/Inertial solution for comparison. Both the estimated ('+' marks) and surveyed ('x' marks) map positions are also plotted. After take-off, the vehicle underwent autonomous flight in an oval trajectory, and SLAM was then activated from the ground station. The SLAM system incrementally built a map and simultaneously used it to estimate the inertial errors. Figure 5 presents detailed successive 2D plots of SLAM during the first round, until the vehicle closes the loop, which show the navigation system incrementally building a map and navigating as the vehicle traverses the flight path.



Fig. 4. The vehicle and feature positions estimated by the real-time SLAM system. The GNSS/Inertial position and the surveyed feature positions are plotted for comparison, showing the correspondence with the ground truth. 1σ ellipsoidal bounds were used for the feature uncertainty.

During loop closure, it can be seen that both the vehicle and map accuracies improve. Successive re-observation of the features continues to improve the accuracy of the navigation and mapping solution. Figure 6 shows the results from subsequent passes. The map uncertainty decreases monotonically towards its lower limit, becoming less sensitive to the addition of further information. This clearly illustrates the success of the implementation and its usefulness in airborne applications.

The difference between the SLAM-estimated vehicle position and the reference GNSS/Inertial solution was partly due to the errors in the initial parameters and the height deviation. The more significant source of error was the range calculation in undulating 3D terrain. The non-parallel relationship between the plane of the feature and the camera image plane makes the vehicle observe the feature at an angle. Hence the detected feature dimension is always smaller than its actual value, resulting in a larger range estimate. Since the camera detects a new feature in the image plane, this can either push forward the feature position or push back the vehicle position/velocity along the longitudinal direction. This effect can be seen more clearly in Figure 5(d), where the increased vehicle uncertainty after the first cornering without features caused a noticeable backward correction. These errors can be overcome by using a more accurate mission sensor and a proper ranging device.
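Under the same assumed pinhole model as in Section 3, this foreshortening can be quantified: a feature plane inclined at angle $\alpha$ to the image plane shrinks the imaged width from $w_0$ to $w_0 \cos\alpha$, so the size-based range estimate becomes

$$
\hat{\rho} \approx \frac{f\, W}{w_0 \cos\alpha} = \frac{\rho}{\cos\alpha} \ \geq\ \rho,
$$

which is always an overestimate, consistent with the push-forward/push-back behaviour described above.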



[Figure 5: six panels plotting North (m) against East (m), with numbered feature marks accumulating from panel (a) to panel (f).]
Fig. 5. Detailed views of real-time SLAM during the first round. (a) The SLAM system was activated during the flight. (b), (c) and (d) present the incremental building of the map, which is used to estimate the inertial errors simultaneously (5σ ellipses are plotted for the features). (e) shows the vehicle approaching the initial position, and in (f) the loop is closed by observing previous features. Both the map and vehicle accuracies improved.



[Figure 6: six further North (m) versus East (m) panels over the successive loops, with feature marks 1-33.]
Fig. 6. Detailed views of real-time SLAM during the successive loops. The plots show that the uncertainties of the vehicle and map decrease further, and monotonically, due to the successive re-observations and loop closures.

In Figure 7, the position, velocity and attitude estimates from SLAM are compared with the GNSS/Inertial solutions, and the SLAM-estimated 1σ covariances are plotted.



[Figure 7: six panels of time histories over 0-500 s; position, velocity and attitude comparisons alongside their 1σ covariances.]
Fig. 7. (a) Comparison of positions between SLAM (blue solid line) and GNSS/Inertial (red dash-dot line), and (b) the SLAM-estimated position covariances (1σ) in real time. (c) compares velocities, with the SLAM-estimated covariances in (d), and (e) compares attitudes, with covariances in (f). Periodic reductions in the covariances can be observed, resulting from the SLAM loop closures.

The large errors in the position estimates are due to the low-quality inertial system. The velocity and attitude results are more promising and comparable to those of the GNSS/Inertial system, showing the ability of the SLAM filter to stabilise the INS. Table 3 presents the standard deviations of the SLAM errors, obtained by subtracting the SLAM estimates from the GNSS/Inertial outputs and the ground-surveyed map positions.



Table 3
Standard deviations of SLAM errors computed by subtracting SLAM estimates from GNSS/Inertial solutions.

Errors          X(1σ)   Y(1σ)   Z(1σ)
Position (m)    64.44   60.93   15.59
Velocity (m/s)  5.01    4.46    1.49
Attitude (°)    1.35    1.15    1.89
Map (m)         19.45   31.98   30.29

By comparing these values with the estimated covariances (plotted in Figure 7), it can be observed that the SLAM filter maintained over-confident covariance estimates. This can be improved by performing a more precise analysis of the errors in the inertial sensors and the initialisation. However, as the process noises increase, the vehicle states, particularly the attitude, become more sensitive to SLAM observation updates. This can be overcome either by partitioning the attitude update from the velocity/position updates, or by fusing an extra aiding source, such as an air-data system, to improve the INS performance between vision updates and to maintain the uncertainty within an acceptable level.

There are still a number of theoretical, technical, and practical issues which need to be resolved, including SLAM consistency, data synchronisation between vision and INS, real-time implementation of the indirect filter, natural feature detection and representation, and the incorporation of sub-map techniques for large-scale deployment.

5 Conclusions

This paper presented the results of a real-time implementation of airborne SLAM on a UAV platform, fusing inertial data with information from a vision sensor. Although airborne SLAM is still in its infancy with many exciting areas for future research, the results presented here clearly illustrate its capability as a reliable and accurate airborne navigation and mapping system. SLAM consistency and robustness need to be further investigated. It was observed that any inconsistent or overconfident attitude update would severely affect the SLAM solution. This problem is currently being addressed by decoupling the attitude update from the position/velocity update, or by fusing an additional velocity observation, such as air velocity, to make SLAM less susceptible to such updates and to enhance the robustness of airborne SLAM.



References

[1] B. Parkinson, Origins, Evolution, and Future of Satellite Navigation, Journal of Guidance, Control, and Dynamics 20 (1) (1997) 11–25.

[2] J. Kim, S. Sukkarieh, S. Wishart, Real-time Navigation, Guidance and Control of a UAV using Low-cost Sensors, in: International Conference on Field and Service Robotics, Yamanashi, Japan, 2003, pp. 95–100.

[3] R. Smith, M. Self, P. Cheeseman, Estimating Uncertain Spatial Relationships in Robotics, Vol. 2, Elsevier Science, 1988.

[4] J. Guivant, E. Nebot, Optimization of the Simultaneous Localisation and Map Building Algorithm for Real-time Implementation, IEEE Transactions on Robotics and Automation 17 (3) (2001) 242–257.

[5] S. Williams, M. Dissanayake, H. Durrant-Whyte, An Efficient Approach to the Simultaneous Localisation and Mapping Problem, in: IEEE International Conference on Robotics and Automation, Vol. 1, 2002, pp. 406–411.

[6] C. Estrada, J. Neira, J. Tardos, Hierarchical SLAM: Real-time Accurate Mapping of Large Environments, IEEE Transactions on Robotics 21 (4) (2005) 588–596.

[7] S. Thrun, Y. Liu, D. Koller, A. Ng, Z. Ghahramani, H. Durrant-Whyte, Simultaneous Localization and Mapping with Sparse Extended Information Filters, International Journal of Robotics Research 23 (2004) 693–716.

[8] A. Nuchter, H. Surmann, K. Lingemann, J. Hertzberg, S. Thrun, 6D SLAM with an Application in Autonomous Mine Mapping, in: IEEE International Conference on Robotics and Automation, 2004, pp. 1998–2003.

[9] I. Jung, S. Lacroix, High Resolution Terrain Mapping using Low Altitude Aerial Stereo Imagery, in: The Ninth IEEE International Conference on Computer Vision, 2003.

[10] J. Langelaan, S. Rock, Passive GPS-Free Navigation for Small UAVs, in: 2005 IEEE Aerospace Conference, 2005, pp. 1–9.

[11] J. Kim, S. Sukkarieh, Autonomous Airborne Navigation in Unknown Terrain Environments, IEEE Transactions on Aerospace and Electronic Systems 40 (3) (2004) 1031–1045.

[12] P. Savage, Strapdown Analytics, Part 1 and 2, Strapdown Associates, Inc., 2000.

[13] D. Titterton, Strapdown Inertial Navigation Technology, Peter Peregrinus, Ltd., 1997.

[14] J. Kim, S. Sukkarieh, Recasting SLAM - Towards Improving Efficiency and Platform Independency, in: Robotics Research: The 11th International Symposium, Springer-Verlag, 2005.
