

Development of an Unmanned Helicopter for Vertical Replenishment

Fei Wang*, Peidong Liu†, Shiyu Zhao‡, Ben M. Chen§, Swee King Phang¶, Shupeng Lai||, Tao Pang**, Biao Wang††, Chenxiao Cai‡‡, Tong H. Lee§§

Unmanned Systems Research Group, National University of Singapore, E4A-03-04, 3 Engineering Drive 3, Singapore 117582, Singapore

This paper presents an intelligent and robust guidance, navigation and control solution for a rotary-wing UAV to carry out an autonomous cargo transportation mission between two moving platforms. Different from the conventional GPS/INS-only navigation scheme, this solution also integrates sophisticated Lidar and vision systems capable of precisely locating the cargo loading and unloading positions. In addition, a complementary GPS/INS system is set up on the moving platforms, with communication to the unmanned helicopter, so that the controlled UAV is able to follow the dynamic platforms with good tracking performance. The whole system has been successfully implemented, and with its superb performance the Unmanned Systems Research Group from the National University of Singapore won the first place in the final round of the rotary-wing category competition of the 2nd AVIC Cup — International UAV Innovation Grand Prix 2013.

Keywords: UAV vertical replenishment; guidance, navigation and control; vision-based guidance.

1. Introduction

Rotorcraft are often used for cargo transportation and vertical replenishment between seaborne vessels (see Fig. 1). The conventional way involves a manned helicopter, which relies on the skills and experience of a well-trained human pilot. The recent advancement of unmanned aerial vehicles (UAVs), however, has opened the possibility of using unmanned rotorcraft for this kind of cargo transportation task, which can reduce both risk and cost to a large extent.

One example of such a UAV transportation application is the AirMule UAV from UrbanAero. It is able to transport up to 500 kg of cargo to places as far as 50 km away and has been used to transport cargo in Israel for military purposes [1]. In [2], an innovative control method has been proposed to solve the general UAV slung-load problem. A group of researchers has also proposed the estimation of load position and velocity in such systems [3]. The cargo transportation problem can also be solved by a rigid claw mechanism such as those appearing in [4, 5]. Besides, some researchers have investigated load transportation through the collaboration of multiple UAVs [6, 7]. With this cooperative structure, the size and cost of each individual UAV can be reduced.

However, when solving this UAV cargo transportation problem, most of the existing works assume that the loading and unloading positions are accurately known. This assumption is reasonable in a few situations where the environment is fully controlled, but may not be valid in more general cases. To expand the range of applications a small-scale UAV can handle, an intelligent navigation and guidance system which can provide high-quality measurements and guidance information for UAV automatic flight control needs to be developed. One elegant solution is to integrate a computer vision sub-system for target searching and tracking. In fact, vision-based target detection and localization have been investigated intensively. Some approaches rely on visual targets with special shapes and features, such as [8], in which range estimation is carried out based on specific geometric features including points, lines and curves. Others target more general objects such as a helipad [9], a mobile ground vehicle [10, 11] or another UAV [12]. In addition, there is a trend of integrating visual information into feedback control for mobile robot autonomous grasping and manipulation [13].

Received 14 May 2014; Revised 21 November 2014; Accepted 21 November 2014; Published 27 January 2015. This paper was recommended for publication in its revised form by editorial board member, Chang Chen.

Email Addresses: *[email protected], †[email protected], ‡[email protected], §[email protected], ¶[email protected], ||[email protected], **[email protected], ††[email protected], ‡‡[email protected], §§[email protected]

Unmanned Systems, Vol. 3, No. 1 (2015) 63–87
© World Scientific Publishing Company
DOI: 10.1142/S2301385015500053

Although abundant solutions to the aforementioned individual problems have been proposed in the literature, there is little research on combining them and implementing a fully functional cargo transportation UAV with all the useful characteristics and capabilities. In this paper, we propose a comprehensive UAV cargo transportation system which incorporates a small-size single-rotor helicopter with onboard sensors and processors, an innovative cargo grabbing mechanism, a set of UAV autonomous guidance, navigation and control (GNC) algorithms, and a cargo searching and localization vision sub-system.

The developed UAV system, named NUS2T-Lion, has taken part in the 2nd AVIC Cup — International UAV Innovation Grand Prix (UAVGP), which was held in Beijing in September 2013. In this competition, the rotary-wing UAVs from the participating teams are required to automatically transport cargos between two parallel moving ships. The cargos are in the form of buckets with handles, and they are initially placed within colored circles drawn on the surface of the first ship. Circles of a different color are drawn on the other ship, indicating the unloading positions. The ships are simulated by ground platforms moving on railways (see Fig. 2). Figure 3 shows a snapshot of NUS2T-Lion carrying the cargo bucket in this Grand Prix.

The structure of this paper is as follows. Section 2 discusses the design and integration of the UAV hardware system. Secs. 3–5 elaborate on the UAV control, navigation and guidance algorithms, respectively. Section 6 explains how the GNC algorithms are implemented in the onboard software together with the complete mission logic. Section 7 provides flight test data to verify the fidelity and performance of the overall system. Concluding remarks are made in Sec. 8.

2. Hardware Configuration

The hardware configuration of NUS2T-Lion follows the rotorcraft UAV structure proposed in [14]. As illustrated in Fig. 4, in which each block represents an individual hardware device, the whole system consists of four main parts, namely a bare rotorcraft platform, an onboard avionic system, a manual control system and a ground control system (GCS). While the manual control system and the GCS are quite standard for all kinds of UAV systems, the choices of the bare rotorcraft platform and its onboard avionic system are usually application dependent. For this case, they should be selected and integrated specifically for the UAV cargo transportation task. It is believed that by designing the hardware configuration effectively, difficulties in the later software algorithm development can be minimized.

Fig. 1. Rotorcraft vertical replenishment (U.S. Navy Photo: Use of released U.S. Navy imagery does not constitute product or organizational endorsement of any kind by the U.S. Navy).

Fig. 2. Cargo platform in UAVGP.

Fig. 3. NUS2T-Lion transporting a bucket cargo.

2.1. Bare rotorcraft platform

The Thunder Tiger Raptor 90 SE Nitro radio-controlled (RC) helicopter is adopted as the bare rotorcraft platform in this work. It is a hobby-level single-rotor helicopter originally designed for acrobatic flight. Compared with other commercial off-the-shelf (COTS) RC rotorcraft such as the Turbulence D3 and the Observer Twin, the Raptor 90 SE provides a reliable structural design and equivalent flight performance at approximately half the price.

However, with the original Raptor 90's nitro engine and nitro fuel tank, the endurance of the UAV can barely reach 8 min with a full avionics load. This is not sufficient for practical applications. To overcome this limitation, the original nitro engine is replaced by a gasoline counterpart, a Zenoah product with model number G270RC. With the more efficient gasoline engine, a full-tank Raptor 90 can fly up to 30 min. This greatly widens the range of potential applications for this UAV and is especially beneficial to the cargo transportation task.

Unfortunately, this endurance improvement comes with two trade-offs. First, the vibration of the whole platform intensifies due to the gasoline engine. Second, the ignition magnet inside the Zenoah G270RC is so large that its magnetic field can badly affect the onboard sensors. To overcome the vibration issue, wire rope isolators are used to protect the onboard avionics and filter out unwanted high-frequency noise. This solution will be discussed in Sec. 2.3. For the problem of magnetic interference, the final solution is to replace the electro-magnetic ignition system inside the engine with a pure electric ignition system. With this modification, the onboard sensors, especially the magnetometer, all work in the way they originally should.

To cope with the cargo transportation task, a loading mechanism must be integrated into the helicopter platform. Comparing a rigid claw-like grabbing mechanism with a long flexible rope hooking mechanism, the former is more precise in picking up the cargo, while the latter avoids descending the UAV too close to the ship surface, where the aerodynamic ground effect becomes significant.

In this work, an innovative design incorporating advantages from both sides has been proposed. The solution is a claw-like grabbing mechanism with very long arms (see Fig. 5). With this design, the UAV can keep a safe distance from the ship surface and, at the same time, grab and release the cargo in a precise and reliable way. Another highlight of this design is its omnidirectional feature: no matter in which direction the cargo handle is oriented, it is not necessary for the UAV to adjust its heading to align accordingly. This saves time and minimizes unnecessary UAV maneuvers.

In addition, this design features a self-locking mechanism commonly used in the landing gear of hobby-grade fixed-wing planes. The mechanism is enclosed in the rectangular boxes shown in Fig. 5, with each box supporting one arm and powered by one servo motor. When the claw fully opens or closes, a slider inside the box locks the position of the servo motor. In this way, the servo motors consume zero power while carrying a heavy cargo.

A load sensing mechanism which can differentiate a successful cargo loading from a failure is also installed. This mechanism acts as a safeguard in cases where the UAV makes a grasping action but the targeted cargo is not loaded successfully. Knowing that the cargo loading is unsuccessful, the UAV can descend and try grasping the cargo again. The detailed design is shown in Fig. 6, where four limit switches, which send out electrical signals when pushed down, are installed on the customized landing skid. The baseplate of the claw is rigidly attached to a hollow rectangular plate on its top. The rectangular plate in turn rests on the cross-over beams of the landing skid via four springs. When the claw is loaded, the rectangular plate compresses the springs and triggers one or more of the limit switches. When the claw is unloaded, the springs push up the rectangular plate to release the limit switches.

Fig. 4. Hardware configuration of NUS2T-Lion rotorcraft system.

Fig. 5. Grabbing mechanism in closed and open configurations.
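The grab-and-retry safeguard described above reduces to a few lines of logic; the function names and the three-attempt limit below are hypothetical, not taken from the actual flight code.

```python
def cargo_loaded(switch_states, min_pressed=1):
    """True if the sprung baseplate presses at least `min_pressed` of the
    four landing-skid limit switches, i.e. a cargo is hanging on the claw."""
    return sum(1 for s in switch_states if s) >= min_pressed

def grab_with_retry(descend, grab, read_switches, max_attempts=3):
    """Descend and grab; if the limit switches report no load, try again."""
    for _ in range(max_attempts):
        descend()
        grab()
        if cargo_loaded(read_switches()):
            return True
    return False
```

Requiring only one of the four switches makes the check robust to an off-center load that compresses a single spring.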

2.2. Avionic system

To realize fully autonomous flight, an onboard avionic system with sensors, processors and other electronic boards has to be designed. All components used on NUS2T-Lion are carefully chosen, up-to-date COTS products. Figure 7 gives a complete view of the onboard system with the key components indicated. The details and usage of these components are explained as follows.

2.2.1. Onboard sensors

The IG-500N GPS/INS (GPS-aided inertial navigation system) unit is chosen as the fundamental navigation sensor for NUS2T-Lion. The IG-500N is one of the world's smallest GPS-enhanced attitude and heading reference systems (AHRS), embedded with an extended Kalman filter (EKF). It includes a microelectromechanical systems (MEMS) based IMU, a GPS receiver and a barometer. It is able to provide precise and drift-free 3D orientation and position even during aggressive maneuvers, updated at 100 Hz. With its presence, the UAV's attitude, velocity and position can be consistently obtained, although the position measurement from the IG-500N alone is not accurate enough for the precise cargo loading and unloading task.

The second main sensor used onboard NUS2T-Lion is the mvBlueFOX camera from Matrix Vision. It is a compact industrial CMOS camera, compatible with any computer with a USB port. Its superior image quality makes it suitable for both indoor and outdoor applications. In addition, it incorporates a field-programmable gate array (FPGA), which reduces the computer load to a minimum during image pre-processing. The standard Hi-Speed USB interface guarantees easy integration without any additional interface board. In this specific cargo transportation application, it is the main guidance sensor for locating the cargos and their unloading points.

For cargo transportation applications, height measurement from GPS/INS or a barometer may not be accurate enough for the UAV to pick up or drop the cargo appropriately. The UAV may even crash onto the surface of the cargo platform because of inaccurate height measurement, resulting in catastrophic consequences. While a vision sensor or a 1D laser range finder may accomplish the task, the former can only be relied on when the visual target is within the field of view, and the latter cannot handle ground surfaces with scattered obstacles. To make the height measurement accurate and consistent, a scanning laser range finder is the best choice. The URG-30LX laser scanner from Hokuyo is installed in the system. It has a maximum range of 30 m with a fine resolution of 50 mm, and it can scan a frontal 270° fan-shaped area with an angular resolution of 0.25°.
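One way such a scanning range finder can yield a robust height estimate (a sketch of the general idea, not the authors' actual algorithm) is to project the near-vertical beams onto the vertical axis and take their median, which rejects beams that hit scattered obstacles; a real implementation would additionally rotate the beams by the aircraft attitude.

```python
import math

def height_above_ground(ranges, angles, max_tilt_rad=0.35):
    """Estimate height from a downward-facing scan.

    ranges : beam ranges in meters
    angles : beam angles from the vertical, in radians
    Only beams within `max_tilt_rad` of nadir are used; the median of their
    vertical projections r*cos(a) discards outliers from deck clutter.
    """
    verticals = sorted(r * math.cos(a) for r, a in zip(ranges, angles)
                       if abs(a) <= max_tilt_rad)
    n = len(verticals)
    if n == 0:
        return None  # no usable beams in this scan
    mid = n // 2
    return verticals[mid] if n % 2 else 0.5 * (verticals[mid - 1] + verticals[mid])
```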

2.2.2. Onboard computers

There are two onboard computers in the avionic system: one for the implementation of guidance, navigation and control algorithms, and the other, more powerful one dedicated to vision processing. With this dual-computer structure, the vision algorithm can be implemented and tested separately at the development stage, and it is very convenient to upgrade to a more powerful vision computer in the future without modifying the control hardware and software. The structure also improves the reliability of the overall system, since it ensures control stability even when the vision computer malfunctions or encounters run-time errors. Such failures happen more frequently on the vision computer than on the control computer, because the vision algorithm usually involves more sophisticated calculations and logic. If a failure ever happens, the UAV can still fly safely with the control computer alone, and there will be enough time for the human pilot to take over and land the UAV safely.

Fig. 6. Landing gear with bucket grabbing and load sensing functions.

Fig. 7. Onboard avionic system of NUS2T-Lion.
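The fallback behaviour can be realized with a simple heartbeat watchdog on the control computer; this sketch (class name and timeout value are illustrative) lets the control computer ignore vision guidance once vision packets stop arriving.

```python
import time

class VisionWatchdog:
    """Track liveness of the vision computer from its message stream.

    If no heartbeat arrives within `timeout_s`, the control computer should
    fall back to GPS/INS-only guidance and keep the UAV flying safely.
    """
    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()

    def heartbeat(self):
        # Call whenever a packet from the vision computer is received.
        self.last_beat = time.monotonic()

    def vision_alive(self):
        return (time.monotonic() - self.last_beat) < self.timeout_s
```

`time.monotonic()` is used rather than wall-clock time so the check is immune to clock adjustments.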

The onboard control computer collects measurement data from the various sensors, performs sensor filtering and fusion, executes the flight control law, and outputs control signals to carry out the desired control actions. In addition, it is responsible for communicating with the GCS as well as data logging. Being a light-weight yet powerful embedded computer for real-time tasks, the Gumstix Overo Fire is selected for this purpose. It has a main processor running at 720 MHz and a DSP coprocessor. The main processor is an OMAP3530 ARM chip from Texas Instruments, one of the fastest low-power embedded processors as of this writing. Moreover, it has a built-in Wi-Fi module, which saves the weight of an additional communication device.
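The sense-control-actuate-log cycle above runs at a fixed rate; a minimal fixed-period scheduler of the kind such a flight-control task uses might look like the following (illustrative only, not the actual onboard code).

```python
import time

def run_control_loop(step, rate_hz=50, cycles=None):
    """Run `step` at a fixed rate.

    `step` encapsulates one cycle: read sensors, run the control law,
    output servo commands, log data. Scheduling against an absolute
    deadline (`next_t`) avoids accumulating drift from each cycle's
    execution time.
    """
    period = 1.0 / rate_hz
    next_t = time.monotonic()
    n = 0
    while cycles is None or n < cycles:
        step()
        n += 1
        next_t += period
        time.sleep(max(0.0, next_t - time.monotonic()))
```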

The onboard vision computer is mainly for implementing image processing algorithms, including color segmentation, object identification, object tracking and localization. Image processing tasks are usually computationally intensive and hence require powerful processors to run the algorithms in real time. We have chosen the Mastermind computer from Ascending Technologies. It has an Intel Core i7 processor but is still small and light enough to be carried by NUS2T-Lion. It also has abundant communication ports to interact with peripheral devices such as USB cameras and Wi-Fi devices. One UART port is used to communicate with the flight control computer.

2.2.3. Servo controller

An 8-channel pulse-width modulation (PWM) servo controller, the UAV100 from Pontech, is used to enable servo control by either an onboard computer via serial port (automatic mode) or the output from an RC receiver (manual mode). Switching between the two modes depends on the state of an auxiliary channel from the RC transmitter. While the UAV maneuvers autonomously in the air, it is desirable to have a failsafe feature that allows the ground pilot to take over control during emergencies. Besides, this servo controller can output quantitative servo values. This makes collecting manual or autonomous control data possible, which is a necessary requirement for UAV dynamic modeling and system identification.
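The mode switch driven by the auxiliary channel amounts to routing one of two command sources to the servos. The sketch below assumes a standard 1000–2000 µs RC pulse range with a mid-point threshold; the names and threshold are ours, not from the UAV100 documentation.

```python
AUTO_THRESHOLD_US = 1500  # assumed mid-point of a 1000-2000 us aux channel

def select_servo_source(aux_pulse_us, computer_cmds, rc_cmds):
    """Route either autonomous or pilot commands to the servos.

    Above the threshold the onboard computer drives the servos; below it,
    the RC receiver does, giving the ground pilot a failsafe takeover.
    """
    return computer_cmds if aux_pulse_us > AUTO_THRESHOLD_US else rc_cmds
```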

2.2.4. Avionic hub

A customized printed circuit board (PCB) called LionHub (see Fig. 8) is developed as an expansion board to host various hardware devices. It is an improved version of a similar board introduced in [15]. The aforementioned IG-500N navigation sensor, the Gumstix Overo Fire computer and the UAV100 servo control board can be physically installed on the slots of this PCB hub and connected to the onboard power regulator and other essential components. Besides the mounting slots, extra mounting holes on LionHub are used to lock the installed modules in place to resist the vibration and shock generated in flight and landing. With the introduction of LionHub, manual wire wrapping is minimized, improving the reliability and quality of the system. A serial RS-232 to TTL voltage-level converter is included in LionHub to connect the output of the IG-500N to the UART port of the Gumstix. Furthermore, to power all the avionics, linear regulators on the avionic hub convert the power input from a 4-cell lithium-polymer (LiPo) battery to 12 V and 5 V outputs with sufficient current delivery. The 12 V output port powers the Mastermind computer and the Lidar sensor, while the 5 V output port powers the Gumstix computer and the other electronic boards.

2.3. System integration

After selecting and configuring the individual mechanical and avionic components, all these hardware parts need to be assembled to form a coherent UAV platform. To accomplish this task, special attention needs to be paid to the layout of the overall onboard system and to anti-vibration considerations.

Fig. 8. Control hub with all hardware components attached.

2.3.1. Layout design

The first priority is to place the navigation sensor as close to the center of gravity (CG) of the whole UAV platform as possible to minimize the so-called lever effect, which biases the acceleration measurement when the UAV platform performs rotational motion. Note that all the other electronic boards on the LionHub will also be located near the CG position because they are rigidly linked to the IMU. Usually there is no problem aligning the IMU so that its planar x- and y-axis positions coincide with the UAV CG. However, since a minimum clearance between the helicopter belly and the onboard system is needed to avoid bumping, a compromise needs to be made in the vertical z-axis, and software compensation can be implemented to minimize the measurement error caused by this vertical offset. In order to have better signal reception, the GPS antenna is placed on the horizontal fin of the helicopter tail. Again, its 3D position offset relative to the IMU needs to be compensated.
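The software compensation for such an IMU offset is the standard lever-arm correction from rigid-body kinematics, a_CG = a_IMU − ω̇ × r − ω × (ω × r); the sketch below (function and variable names are ours) applies it to a body-frame accelerometer reading.

```python
import numpy as np

def lever_arm_correct(acc_meas, omega, omega_dot, r):
    """Remove the lever effect from a body-frame accelerometer reading.

    acc_meas  : specific force measured at the IMU location (m/s^2)
    omega     : body angular rate (rad/s)
    omega_dot : body angular acceleration (rad/s^2)
    r         : IMU position offset from the CG in the body frame (m)
    Returns the acceleration experienced at the CG itself.
    """
    acc_meas = np.asarray(acc_meas, float)
    omega = np.asarray(omega, float)
    omega_dot = np.asarray(omega_dot, float)
    r = np.asarray(r, float)
    # Subtract the tangential (omega_dot x r) and centripetal
    # (omega x (omega x r)) terms induced by rotation about the CG.
    return acc_meas - np.cross(omega_dot, r) - np.cross(omega, np.cross(omega, r))
```

With r = 0 the correction is a no-op, which is why mounting the IMU at the CG is preferred and the correction is only needed for the unavoidable vertical offset.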

The next priority goes to the camera sensor. Considering that the UAV usually flies forward to search for targets and hovers right above the cargo for loading and unloading, the best position for the camera is at the nose of the helicopter. In addition, a controlled pan-tilt gimbal (see Fig. 9) is designed to host the camera sensor so that it always looks vertically downwards despite the UAV's rolling and pitching motions. Taking advantage of the camera's wide viewing angle, even when the UAV descends to the lowest altitude for cargo grabbing, the camera can still see the cargo, which should be right under the UAV CG.
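To first order, the gimbal simply counter-rotates the UAV's roll and pitch, and the wide field of view fixes the ground footprint seen by the nadir-pointing camera at a given height. Both relations are sketched below under small-angle assumptions, ignoring mount offsets; this is our illustration, not the actual gimbal controller.

```python
import math

def gimbal_cmd(roll_rad, pitch_rad):
    """Counter-rotate the pan-tilt so the optical axis stays vertical."""
    return -roll_rad, -pitch_rad

def ground_footprint_radius(height_m, fov_rad):
    """Radius of the ground disc visible to a nadir camera with the given FOV."""
    return height_m * math.tan(fov_rad / 2)
```

For example, a 90° field of view at 2 m altitude still covers a 2 m radius on the deck, which is why the cargo under the CG remains visible during the final descent.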

In order to retain CG balance, the cargo loading mechanism needs to be installed precisely under the UAV CG. In this way, the UAV roll and pitch dynamics will not change too much after the cargo is loaded, so the same set of robust control laws can be used. This design also makes sure that controlling the UAV CG to the correct planar position is equivalent to controlling the cargo loading mechanism to the correct position, so that a precise grabbing action can take place.

The placement of the remaining onboard components is less restricted. The overall CG balance can be achieved by adjusting their mounting positions. In our case, the laser scanner is positioned at the back end of the onboard system, scanning downwards. The vision computer is put at the front to counter-balance the laser scanner and to shorten the wiring to the camera sensor. The battery is slotted at a bottom middle position so that it adds minimal moment of inertia to the whole UAV platform.

With the above layout design, the distribution of mass is balanced, the control challenge caused by the cargo loading is minimized, and all sensors work properly. An aluminium plate is used to mount all the onboard components, and it sits on four wire rope isolators (see Fig. 10), which help to solve the mechanical vibration problem.

2.3.2. Anti-vibration

Anti-vibration for the onboard avionics is one of the most important considerations in the hardware design. It can improve the overall performance of the UAV system significantly by reducing wear and tear of the mechanical and electrical connectors and attenuating unwanted noise at high frequencies. Indeed, replacing the nitro engine with a gasoline engine amplifies the vibration issue. The main vibration sources on NUS2T-Lion are its main rotor and the engine. From a frequency analysis of the in-flight acceleration data logged while hovering (see Fig. 11), one can see that the most significant high-frequency vibration occurs at 22 Hz.
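The frequency analysis behind Fig. 11 is a plain FFT of the logged accelerometer trace. The sketch below recovers the dominant vibration frequency from synthetic hover data; the function name and the synthetic signal are ours.

```python
import numpy as np

def dominant_vibration_hz(acc, fs):
    """Return the strongest nonzero-frequency component of an acceleration log.

    acc : 1-D array of acceleration samples
    fs  : sampling rate in Hz
    The mean is removed so the DC component does not mask the vibration peak.
    """
    spec = np.abs(np.fft.rfft(acc - np.mean(acc)))
    freqs = np.fft.rfftfreq(len(acc), d=1.0 / fs)
    return freqs[np.argmax(spec)]

# Synthetic 5 s hover log sampled at 100 Hz with a 22 Hz vibration component:
t = np.arange(0, 5, 0.01)
acc = 0.3 * np.sin(2 * np.pi * 22 * t) + 0.02 * np.random.randn(t.size)
print(dominant_vibration_hz(acc, 100))  # 22.0
```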

Fig. 9. Camera pan-tilt mechanism.

Fig. 10. Anti-vibration using wire rope isolators.

To attenuate noise at this specific frequency, the CR4-400 compact wire rope isolator from Enidine is used. According to the CR series manual provided by Enidine, the best stiffness for the chosen isolator, Kv, can be calculated as

Kv = Ws (2π fi / 3)² / g,    (1)

where Ws is the static load on each isolator, fi is the input excitation frequency to be attenuated, and g is the gravitational constant. In our case, about 2 kg of onboard load is shared by four isolators, which gives Ws = 4.9 N. Substituting fi = 22 and g = 9.781 into (1), Kv is calculated as 1.06 kN/m, which is best matched by the vibration stiffness of the CR4-400 mounted in the "45° Compression/Roll" mode. There are also the "Pure Compression" and "Shear/Roll" mounting methods, but the "45° Compression/Roll" mode is the best for attenuating vibration in all three axes. After the installation of the wire rope isolators, Fig. 12 shows the improved acceleration measurements. Compared with the original graph, the higher-frequency noises have been reduced by a factor of 10 or more.
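The stiffness formula (1) and the quoted numbers can be checked directly; units assumed here are Ws in N and Kv in N/m.

```python
import math

def isolator_stiffness(static_load_n, excitation_hz, g=9.781):
    """Kv = Ws * (2*pi*fi / 3)^2 / g, per Eq. (1)."""
    return static_load_n * (2 * math.pi * excitation_hz / 3) ** 2 / g

# 2 kg of avionics shared by four isolators -> Ws ~= 4.9 N per isolator,
# attenuating the 22 Hz engine/rotor vibration:
kv = isolator_stiffness(4.9, 22.0)
print(round(kv / 1000, 2))  # 1.06 (kN/m)
```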

When the hardware platform is ready, guidance, navigation and control algorithms need to be developed to make the UAV autonomous and intelligent. Figure 13 illustrates the overview of the GNC structure implemented in this work. The control block makes sure the UAV attitude is stable and can robustly track waypoint references. The navigation block provides all the necessary measurements by fusing raw data from the UAV onboard GPS/INS unit, the ship GPS/INS unit and the UAV onboard laser scanner. Last but not least, by implementing a vision-based target detection and tracking sub-system, the guidance block generates smooth reference trajectories for the UAV to follow in order to carry out meaningful tasks. In the following three sections, these three blocks will be explained in the order of control, navigation and guidance. In fact, they form the most significant and innovative contributions of this paper.
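As a much-simplified stand-in for the navigation block's multi-rate fusion, the sketch below runs a 1-D constant-velocity Kalman filter: predict at the fast control rate, and correct whenever a slower height measurement (laser or GPS) arrives. All state dimensions, matrices and noise values are illustrative, not those of the actual EKF.

```python
import numpy as np

class AltitudeKF:
    """1-D constant-velocity Kalman filter, state x = [height, climb rate]."""

    def __init__(self, dt=0.02, q=0.5, r=0.05):
        self.x = np.zeros(2)
        self.P = np.eye(2)
        self.F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity model
        self.Q = q * np.array([[dt**3 / 3, dt**2 / 2],   # process noise
                               [dt**2 / 2, dt]])
        self.H = np.array([[1.0, 0.0]])                  # we only measure height
        self.R = np.array([[r]])

    def predict(self):
        # Run at the fast (e.g. 50 Hz) control rate.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        # Run only when a slower height sample arrives.
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ (np.array([z]) - self.H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

The real navigation block fuses far richer states and sensors, but the predict-at-fast-rate / update-at-slow-rate pattern is exactly the multi-rate structure visible in Fig. 13.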

3. Modeling and Control

For all UAV-related applications, stability of the controlled platform is the most fundamental problem and needs to be solved first; otherwise, there is no foundation for high-level navigation and guidance algorithms to be built upon. In this paper, we decompose the UAV control problem into two layers, namely the attitude stabilization layer and the position tracking layer. The former involves the design of an inner-loop control law which makes sure the UAV roll, pitch and yaw dynamics are robustly stable. The latter involves the design of an outer-loop control law which enables the UAV to track any smooth 3D trajectory reference in a responsive and precise way. However, most advanced control design methods require an accurate dynamic model of the controlled UAV platform. Hence, the following discussion starts from NUS2T-Lion's model identification, and then proceeds to the control structure formulation, the inner-loop control law design, the outer-loop control law design, and the inner-loop command generator which connects the two layers.
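The two-layer decomposition is a classical cascade: the outer loop turns position error into an attitude reference, which the inner loop turns into an actuator command. A one-axis, proportional-only sketch with purely illustrative gains (the actual design uses far more sophisticated control laws):

```python
class P:
    """Proportional controller on a scalar error."""
    def __init__(self, kp):
        self.kp = kp
    def __call__(self, err):
        return self.kp * err

outer = P(kp=0.4)   # position error (m) -> pitch reference (rad)
inner = P(kp=2.0)   # attitude error (rad) -> normalized longitudinal input

def control_step(pos_ref, pos, pitch):
    """One cascade step: outer loop feeds the inner loop's reference."""
    pitch_ref = outer(pos_ref - pos)
    return inner(pitch_ref - pitch)
```

The cascade works because the attitude dynamics are much faster than the position dynamics, so the outer loop can treat the stabilized inner loop as a near-ideal actuator.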

3.1. Flight dynamics modeling

The modeling of NUS2T-Lion follows a similar procedureintroduced in [14]. The model structure is shown in Fig. 14where forces and torques (Fb and Mb) generated by variousmechanical parts of the helicopter are fed into the six

Fig. 11. Frequency analysis of acceleration without isolators (FFT magnitudes |ax|, |ay|, |az| versus frequency, 0–25 Hz).

Fig. 12. Frequency analysis of acceleration with isolators (FFT magnitudes |ax|, |ay|, |az| versus frequency, 0–25 Hz).

Development of an Unmanned Helicopter 69

Un. Sys. 2015.03:63-87. Downloaded from www.worldscientific.com by NATIONAL UNIVERSITY OF SINGAPORE on 01/30/15. For personal use only.

degree-of-freedom (DOF) rigid-body dynamics followed by the kinematics. There are four inputs to the whole helicopter system, namely the collective pitch input δcol, controlling the heave dynamics, the lateral input δlat, controlling the rolling dynamics, the longitudinal input δlon, controlling the pitching dynamics, and the pedal input δped, controlling the yawing dynamics. Some cross-couplings exist among the four channels. The outputs of the nonlinear model can be found at the right side of Fig. 14, namely the UAV global-frame position Pn, body-frame velocity Vb, attitude angles φ

[Block diagram: guidance (vision system, reference generator), navigation (ship GPS/INS, UAV GPS/INS, laser scanner, Kalman filter) and control (outer-loop controller, command generator, inner-loop controller) blocks acting on the UAV dynamics, with update rates between 1 Hz and 50 Hz.]

Fig. 13. Overall structure of guidance, navigation and control.

[Block diagram: the main rotor (with flapping dynamics), tail rotor, fuselage, vertical fin and horizontal fin generate the forces and moments Fb and Mb, which drive the 6-DOF rigid-body dynamics and kinematics; a yaw-rate feedback controller processes the pedal input δped.]

Fig. 14. Model structure of NUS2T-Lion.

70 F. Wang et al.


(roll), θ (pitch), ψ (yaw) and their corresponding angular rates p, q, r. Besides, as and bs are the longitudinal and lateral flapping angles of the main rotor, and δped,int is an intermediate state variable representing the yaw dynamics of the UAV. These three state variables are not measurable. Va and Vwind represent the air velocity and wind disturbance respectively.

The detailed nonlinear model formulation and parameter identification closely follow the procedures in [14] and thus will not be repeated here. Some of the model parameters can be simply measured, such as the dimensions, mass and moment of inertia of the UAV platform, while the remaining ones need to be identified by carrying out test-bench experiments or actual flights. In identifying the model parameters of NUS2T-Lion, a MATLAB-based software package, Comprehensive Identification from FrEquency Response (CIFER), was used. CIFER was developed by the NASA Ames Research Center for military rotorcraft systems. It searches for the optimum model parameters by comparing the frequency-domain responses of the proposed model to the actual flight data. The most critical model-data fitting results, showing the main channel responses, are briefly presented here. In Figs. 15–17, the solid lines and the dashed lines show the frequency responses of the in-flight data and the fitted model respectively. It can be seen that the model fits the flight test data very well, indicating good fidelity of the derived parameters.

3.2. Control structure

In control engineering, a divide-and-conquer strategy is usually used when a relatively complex system needs to be handled. In flight control engineering, a natural decomposition of the full-order dynamic model of a helicopter is based on motion types, i.e., rotational motion and translational motion. In general, the dynamics of the rotational motion is much faster than that of the translational motion, which makes them separable in the frequency domain. Hence, the overall control system can be formulated in a dual-loop structure, so that the inner-loop and outer-loop controllers can be designed separately. Moreover, the linearized model of the single-rotor helicopter system is found to be of non-minimum phase if the two motion dynamics are combined together. This non-minimum phase characteristic would highly complicate the control problem and needs to be avoided.

Fig. 15. Frequency-domain model fitting: δlat to p (magnitude, phase and coherence versus frequency, 0.1–100 rad/s).

Fig. 17. Frequency-domain model fitting: δrud to r (magnitude, phase and coherence versus frequency, 0.1–100 rad/s).

Fig. 16. Frequency-domain model fitting: δlon to q (magnitude, phase and coherence versus frequency, 0.1–100 rad/s).


For the inner loop, the controlled object covers the rotational motion of the helicopter body, the flapping motion of the rotor blades and the stabilizer bar, as well as the dynamics embedded within the head-lock gyro. The main task of the inner-loop controller is to stabilize the attitude and heading of the helicopter in all flight conditions. In our implementation, the H∞ control method is used to minimize the disturbance from wind gusts. For the outer loop, the controlled object covers only the translational motion. The main task is to steer the helicopter to fly with reference to a series of given locations. A robust and perfect tracking (RPT) approach is implemented to emphasize the time factor. Figure 18 gives an overview of the dual-loop control structure.

3.3. Inner-loop control design

Although the full model of NUS2T-Lion is highly complicated and nonlinear, it is verified by simulation that its inner dynamics, after linearization, is more or less invariant under different non-acrobatic flight conditions. Hence, it is reasonable to design a feedback control law based on the linearized model of the inner-layer dynamics, while using the nonlinear model for verification purposes only. Besides, it is noted that NUS2T-Lion falls into the category of small-scale UAV helicopters, which are quite vulnerable to environmental disturbances such as wind gusts. Hence, the H∞ control method, which is specifically developed to minimize the output error caused by external disturbances, naturally becomes the best choice. The linearized inner-dynamics model of NUS2T-Lion can be represented in a ninth-order state-space form as shown below:

$$\dot{x} = Ax + Bu + Ew, \qquad y = C_1 x + D_1 w, \qquad h = C_2 x + D_2 u, \qquad (2)$$

where x is the state, y is the measurement output, h is the controlled output, u is the input and w is the wind disturbance. More specifically,

$$x = [\,\phi\ \ \theta\ \ \psi\ \ p\ \ q\ \ r\ \ a_s\ \ b_s\ \ \delta_{\mathrm{ped,int}}\,]^{\mathrm{T}}, \qquad (3)$$
$$u = [\,\delta_{\mathrm{lat}}\ \ \delta_{\mathrm{lon}}\ \ \delta_{\mathrm{ped}}\,]^{\mathrm{T}}, \qquad (4)$$
$$w = [\,u_{\mathrm{wind}}\ \ v_{\mathrm{wind}}\ \ w_{\mathrm{wind}}\,]^{\mathrm{T}}, \qquad (5)$$

$$A = [\,0_{9\times 3} \mid \bar{A}\,], \qquad (6)$$

where

$$\bar{A} = \begin{bmatrix}
1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 620.5 & 0 & 0 \\
0 & 0 & 0 & 0 & 327.6 & 0 \\
0 & 0 & -13.5 & 0 & 0 & 165.6 \\
-1 & 0 & 0 & -5.41 & 6.45 & 0 \\
0 & -1 & 0 & -3.72 & -5.41 & 0 \\
0 & 0 & -1 & 0 & 0 & 0
\end{bmatrix}, \qquad (7)$$

$$B = \begin{bmatrix}
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & -54.69 \\
2.975 & -0.3 & 0 \\
0.780 & 3.23 & 0 \\
0 & 0 & -4.46
\end{bmatrix}, \qquad (8)$$

$$E = \begin{bmatrix}
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0 \\
-0.0001 & 0.1756 & -0.0395 \\
0 & 0.0003 & 0.0338 \\
-0.0002 & -0.3396 & 0.6424 \\
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0
\end{bmatrix}. \qquad (9)$$
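The identified matrices above can be assembled directly in code. The NumPy sketch below reproduces Eqs. (6)–(8) and verifies, as a sanity check, that the main-rotor flapping block (the a_s, b_s sub-dynamics) is stable on its own; the code itself is an illustration, not part of the authors' toolchain.

```python
import numpy as np

# Sketch: the identified ninth-order inner model of Eqs. (2)-(8).
Abar = np.array([
    [1, 0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 620.5, 0, 0],
    [0, 0, 0, 0, 327.6, 0],
    [0, 0, -13.5, 0, 0, 165.6],
    [-1, 0, 0, -5.41, 6.45, 0],
    [0, -1, 0, -3.72, -5.41, 0],
    [0, 0, -1, 0, 0, 0],
], dtype=float)
A = np.hstack([np.zeros((9, 3)), Abar])  # Eq. (6): A = [0_{9x3} | Abar]
B = np.array([
    [0, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0],
    [0, 0, -54.69],
    [2.975, -0.3, 0],
    [0.780, 3.23, 0],
    [0, 0, -4.46],
], dtype=float)

# The rotor flapping sub-dynamics (states a_s, b_s) forms a stable block.
flap = A[np.ix_([6, 7], [6, 7])]
```

The flapping block has eigenvalues with real part near −5.41, consistent with fast, well-damped rotor flapping dynamics.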

As the onboard IMU can provide measurements of the first six state variables, C1 can be formed accordingly and D1 can be left as a zero matrix. C2 and D2 constitute the weighting

[Block diagram: outer-loop controller → inner-loop command generator → inner-loop controller → inner-layer dynamics → outer-layer dynamics, with velocity V and position P fed back.]

Fig. 18. Control structure of NUS2T-Lion.


parameters specifying the control objective, and usually need to be tuned for practical implementation. For this case, they are in the following form, which considers the first six state variables and the three control inputs:

$$C_2 = \begin{bmatrix} \operatorname{diag}(c_1, c_2, c_3, c_4, c_5, c_6) & 0_{6\times 3} \\ 0_{3\times 6} & 0_{3\times 3} \end{bmatrix}, \qquad (10)$$

$$D_2 = \begin{bmatrix} 0_{6\times 3} \\ \operatorname{diag}(d_1, d_2, d_3) \end{bmatrix}. \qquad (11)$$

The H∞ control problem is to find an internally stabilizing proper measurement feedback control law,

$$\dot{v} = A_{\mathrm{cmp}} v + B_{\mathrm{cmp}} y, \qquad u = C_{\mathrm{cmp}} v + D_{\mathrm{cmp}} y, \qquad (12)$$

such that the H∞-norm of the overall closed-loop transfer matrix function from w to h is minimized. According to [16], the minimum H∞-norm, γ*, can be exactly computed using numerical algorithms. However, it is almost impossible to find a control law with finite gain that achieves this optimal performance. Usually, an H∞ suboptimal controller is designed instead, achieving a closed-loop H∞-norm smaller than γ, where γ > γ*. It is also proved in [16] that when the subsystem (A, E, C1, D1) is left invertible and of minimum phase, which is exactly the case for NUS2T-Lion, the optimal achievable H∞ control performance under state feedback and under measurement feedback are identical. In other words, it is appropriate to design the state feedback control law and the observer separately for the inner loop of NUS2T-Lion. Moreover, only a reduced-order observer is needed to estimate the three unmeasurable state variables, i.e., as, bs, δped,int.

To design an H∞ reduced-order output feedback control law for the inner dynamics of NUS2T-Lion, the original system (2) can be rewritten as follows:

$$\begin{aligned}
\begin{pmatrix} \dot{x}_1 \\ \dot{x}_2 \end{pmatrix} &= \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} + \begin{bmatrix} B_1 \\ B_2 \end{bmatrix}u + \begin{bmatrix} E_1 \\ E_2 \end{bmatrix}w, \\
\begin{pmatrix} y_0 \\ y_1 \end{pmatrix} &= \begin{bmatrix} 0 & C_{1,02} \\ I & 0 \end{bmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} + \begin{bmatrix} D_{1,0} \\ 0 \end{bmatrix}w, \\
h &= [\,C_{2,1}\ \ C_{2,2}\,]\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} + D_2 u.
\end{aligned} \qquad (13)$$

In this form, the original state x is partitioned into a measurable state x1 and an unmeasurable state x2; y is partitioned into y0 and y1 with y1 ≡ x1. We define an auxiliary subsystem characterized by the matrix quadruple (AR, ER, CR, DR), where

$$(A_R, E_R, C_R, D_R) = \left( A_{22},\ E_2,\ \begin{bmatrix} C_{1,02} \\ A_{12} \end{bmatrix},\ \begin{bmatrix} D_{1,0} \\ E_1 \end{bmatrix} \right); \qquad (14)$$

then the following procedure can be followed to design the reduced-order output feedback H∞ controller.

Step 1: Construction of the State Feedback Gain Matrix. Define an auxiliary system

$$\dot{x} = Ax + Bu + Ew, \qquad y = x, \qquad h = C_2 x + D_2 u. \qquad (15)$$

Select γ > γ*, and compute its corresponding H∞ γ-suboptimal state feedback gain matrix F.

Step 2: Construction of the Observer Gain Matrix. Define another auxiliary system

$$\dot{x} = A_R^{\mathrm{T}} x + C_R^{\mathrm{T}} u + C_{2,2}^{\mathrm{T}} w, \qquad y = x, \qquad h = E_R^{\mathrm{T}} x + D_R^{\mathrm{T}} u. \qquad (16)$$

Select a sufficiently small γ > 0, compute its corresponding H∞ γ-suboptimal state feedback gain matrix FR, and then let $K_R = F_R^{\mathrm{T}}$.

Step 3: Construction of the Output Feedback Controller. Partition F and KR as

$$F = [\,F_1\ \ F_2\,], \qquad K_R = [\,K_{R0}\ \ K_{R1}\,], \qquad (17)$$

in conformity with the partitioning of x and y, respectively. Now define

$$G_R = [\,-K_{R0},\ \ A_{21} + K_{R1}A_{11} - (A_R + K_R C_R)K_{R1}\,];$$

then the reduced-order output feedback controller is given by (12), where

$$\begin{aligned}
A_{\mathrm{cmp}} &= A_R + B_2 F_2 + K_R C_R + K_{R1} B_1 F_2, \\
B_{\mathrm{cmp}} &= G_R + (B_2 + K_{R1} B_1)[\,0,\ F_1 - F_2 K_{R1}\,], \\
C_{\mathrm{cmp}} &= F_2, \\
D_{\mathrm{cmp}} &= [\,0,\ F_1 - F_2 K_{R1}\,].
\end{aligned}$$

Based on the above procedure, with an appropriate choice of C2 and D2, an H∞ reduced-order output feedback controller can be determined. After several rounds of tuning, the final values for the weighting parameters in C2 and D2 are chosen to be

$$c_1 = 13,\quad c_2 = 12,\quad c_3 = 1,\quad c_4 = 1,\quad c_5 = 1,\quad c_6 = 6 \qquad (18)$$


and

$$d_1 = 13,\quad d_2 = 12,\quad d_3 = 30. \qquad (19)$$

The corresponding γ* = 0.2057, and we choose γ = 0.21, which results in the following γ-suboptimal state feedback gain matrix:

$$F = \begin{bmatrix}
-0.9952 & -0.1177 & 0.0017 & -0.0271 & 0.0098 & 0.0143 & -1.8795 & -0.5324 & 0.0457 \\
0.1386 & -0.9927 & -0.0005 & -0.0056 & -0.0467 & -0.0043 & 0.0253 & -1.8175 & -0.0503 \\
-0.0186 & 0.0096 & 0.0526 & -0.0006 & 0.0026 & 0.2379 & -0.0925 & -0.0216 & 1.3287
\end{bmatrix}. \qquad (20)$$

The corresponding feed-forward matrix is calculated as

$$G = \begin{bmatrix}
0.9952 & 0.1177 & -0.0017 \\
-0.1386 & 0.9927 & 0.0005 \\
0.0186 & -0.0096 & -0.0526
\end{bmatrix}.$$

The last three unmeasurable state variables, denoted by x̂, can be estimated by an observer as follows:

$$\dot{\hat{x}} = \bar{F}\hat{x} + \bar{G}y + \bar{H}u, \qquad (21)$$

where

$$\bar{F} = \begin{bmatrix}
-0.9952 & -0.1177 & 0 \\
0.1386 & -0.9927 & 0 \\
-0.0186 & 0 & -28
\end{bmatrix}, \qquad (22)$$

$$\bar{G} = \begin{bmatrix}
0 & 0 & 0 & -9.309 & 0.2404 & 0 \\
0 & 0 & 0 & -1.225 & -5.106 & 0 \\
0 & 0 & 0 & 0 & 0 & -3.452
\end{bmatrix}, \qquad (23)$$

$$\bar{H} = \begin{bmatrix}
2.975 & -0.3 & 0 \\
0.780 & 3.23 & 0 \\
0 & 0 & 4.78
\end{bmatrix}. \qquad (24)$$
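As a quick sanity check of the observer in Eq. (21), F̄ must be Hurwitz for the estimate to converge; the short NumPy sketch below (an illustration, not part of the paper) verifies this.

```python
import numpy as np

# Sanity check: the reduced-order observer of Eq. (21) converges only if
# Fbar of Eq. (22) is Hurwitz (all eigenvalues in the open left half-plane).
Fbar = np.array([[-0.9952, -0.1177, 0.0],
                 [0.1386, -0.9927, 0.0],
                 [-0.0186, 0.0, -28.0]])
eigs = np.linalg.eigvals(Fbar)
stable = bool(np.all(eigs.real < 0))
```

The flapping estimates decay with eigenvalues near −0.99 ± 0.13j, while the yaw-intermediate state converges much faster through the eigenvalue at −28.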

With this set of gain matrices, the inner-loop system is stable with a bandwidth of 2.95 rad/s for the roll angle dynamics, 2.8 rad/s for the pitch angle dynamics and 2.17 rad/s for the yaw angle dynamics.

3.4. Outer-loop control design

For the outer loop, an RPT controller is designed to let the UAV track any 3D trajectory precisely. The controller structure and design techniques are adopted from [17]. By perfect tracking, we mean the ability of the controlled system to track a given reference with arbitrarily fast settling time, subject to disturbances and initial conditions. Consider the following linear time-invariant system:

$$\Sigma: \quad \dot{x} = Ax + Bu + Ew, \qquad y = C_1 x + D_1 w, \qquad h = C_2 x + D_2 u + D_{22} w, \qquad (25)$$

with x, u, w, y, h being the state, control input, disturbance, measurement and controlled output respectively. The task of an RPT controller is to formulate a dynamic measurement control law of the form

$$\begin{aligned}
\dot{v} &= A_c(\varepsilon)v + B_c(\varepsilon)y + G_0(\varepsilon)r + \cdots + G_{\kappa-1}(\varepsilon)r^{(\kappa-1)}, \\
u &= C_c(\varepsilon)v + D_c(\varepsilon)y + H_0(\varepsilon)r + \cdots + H_{\kappa-1}(\varepsilon)r^{(\kappa-1)},
\end{aligned}$$

so that when a proper ε* > 0 is chosen:

(1) The resulting closed-loop system is asymptotically stable subject to zero reference.
(2) If e(t, ε) is the tracking error, then for any initial condition x0,

$$\|e\|_p = \left( \int_0^{\infty} |e(t)|^p\, dt \right)^{1/p} \to 0, \quad \text{as } \varepsilon \to 0.$$

For nonzero references, their derivatives are used to generate additional control inputs. Thus, any reference of the form $r(t) = p_1 t^k + p_2 t^{k-1} + \cdots + p_{k+1}$ is covered by the RPT formulation. Furthermore, any reference that has a Taylor series expansion at t = 0 can also be tracked using the RPT controller.

Similar to the case introduced in [18], the outer dynamics of NUS2T-Lion is differentially flat. That means all its state variables and inputs can be expressed as algebraic functions of the flat outputs and their derivatives. A proper choice of flat outputs is

$$\sigma = [\,x\ \ y\ \ z\ \ \psi\,]^{\mathrm{T}}. \qquad (26)$$

It can also be observed that the first three outputs, x, y and z, are totally independent. In other words, we can consider the UAV as a mass point with constrained velocity, acceleration, jerk, and so on in the individual axes of the 3D global frame when designing its outer-loop control law and generating the position references. Hence, a stand-alone RPT controller based on a double-integrator model in each axis can be designed to track the corresponding reference in that particular axis. For each axis, the nominal system can be written as

$$\dot{x}_n = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} x_n + \begin{bmatrix} 0 \\ 1 \end{bmatrix} u_n, \qquad y_n = x_n. \qquad (27)$$

To achieve better tracking performance, it is common to include an integrator to ensure zero steady-state error


subject to step inputs. Thus, the RPT controller proposed here includes integral action. This requires an augmented system to be formulated as

$$\dot{x}_o = \begin{bmatrix}
0 & 1 & 0 & 0 & -1 & 0 \\
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & 0
\end{bmatrix} x_o + \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 1 \end{bmatrix} u_o, \qquad y_o = x_o, \qquad h_o = [\,1\ \ 0\ \ 0\ \ 0\ \ 0\ \ 0\,]x_o, \qquad (28)$$

where $x_o = [\,\int p_e\ \ p_r\ \ v_r\ \ a_r\ \ p\ \ v\,]^{\mathrm{T}}$, with pr, vr, ar the position, velocity and acceleration references, p, v the actual position and velocity, and pe = pr − p the position tracking error. By following the procedures in [16], a linear feedback control law of the form below can be acquired:

$$u_o = F_o x_o, \qquad (29)$$

where

$$F_o = \left[\; \frac{k_i \omega_n^2}{\varepsilon^3} \quad \frac{\omega_n^2 + 2\zeta\omega_n k_i}{\varepsilon^2} \quad \frac{2\zeta\omega_n + k_i}{\varepsilon} \quad 1 \quad -\frac{\omega_n^2 + 2\zeta\omega_n k_i}{\varepsilon^2} \quad -\frac{2\zeta\omega_n + k_i}{\varepsilon} \;\right]. \qquad (30)$$

ε is a design parameter that adjusts the settling time of the closed-loop system. ωn, ζ, ki are the parameters that determine the desired pole locations of the infinite zero structure of (28) through

$$p_i(s) = (s + k_i)(s^2 + 2\zeta\omega_n s + \omega_n^2). \qquad (31)$$

Theoretically, when the design parameter ε is small enough, the RPT controller gives arbitrarily fast responses. In practice, however, due to the constraints of the UAV physical dynamics and its inner-loop bandwidth, it is safer to limit the bandwidth of the outer loop to one-fifth to one-third of that of the controlled inner-loop system. For NUS2T-Lion, the following design parameters are used for the different axes:

$$x, y:\ \begin{cases} \varepsilon = 1 \\ \omega_n = 0.707 \\ \zeta = 0.707 \\ k_i = 0.25 \end{cases} \qquad\qquad z:\ \begin{cases} \varepsilon = 1 \\ \omega_n = 0.99 \\ \zeta = 0.707 \\ k_i = 0.29 \end{cases}$$
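As a numerical check of Eqs. (28)–(31): applying the gain of Eq. (30) to the augmented system places the non-reference closed-loop poles exactly on the roots of p_i(s). The NumPy sketch below (an illustration, using the x/y-axis parameters listed above) performs this check.

```python
import numpy as np

# Verify the RPT design numerically: with the feedback of Eq. (30) applied
# to the augmented system of Eq. (28), the closed-loop poles not belonging
# to the reference chain should match the roots of p_i(s) in Eq. (31).
eps, wn, zeta, ki = 1.0, 0.707, 0.707, 0.25  # x/y-axis parameters

Fo = np.array([ki * wn**2 / eps**3,
               (wn**2 + 2 * zeta * wn * ki) / eps**2,
               (2 * zeta * wn + ki) / eps,
               1.0,
               -(wn**2 + 2 * zeta * wn * ki) / eps**2,
               -(2 * zeta * wn + ki) / eps])

# Augmented state [int(pe), pr, vr, ar, p, v] as in Eq. (28).
Ao = np.array([[0, 1, 0, 0, -1, 0],
               [0, 0, 1, 0, 0, 0],
               [0, 0, 0, 1, 0, 0],
               [0, 0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0, 1],
               [0, 0, 0, 0, 0, 0]], dtype=float)
Bo = np.zeros((6, 1)); Bo[5, 0] = 1.0

eigs = np.linalg.eigvals(Ao + Bo @ Fo.reshape(1, -1))
key = lambda e: (round(e.real, 6), round(e.imag, 6))
cl_poles = sorted((e for e in eigs if abs(e) > 1e-3), key=key)  # drop ref chain
desired = sorted(np.roots([1.0, 2 * zeta * wn + ki,
                           wn**2 + 2 * zeta * wn * ki,
                           ki * wn**2]), key=key)
```

The three poles of the reference chain (pr, vr, ar) sit at the origin by construction; the remaining three are the stable roots of p_i(s), confirming the gain formula.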

3.5. Inner-loop command generator

We have designed the inner-loop and outer-loop controllers separately to avoid the non-minimum phase problem and to reduce task complexity. As the inner-loop dynamics is designed to be much faster than that of the outer loop, it can be treated as a static gain matrix when viewed from the outer loop. However, the output of the outer-loop controller is physically the desired acceleration in the global frame, ac, while the inner-loop controller expects attitude references (φc, θc, ψc). Obviously, a global-to-body rotation followed by a command conversion is needed. Moreover, the body-frame acceleration ab carries no information about the heading reference ψc. Therefore, unlike the other two attitude references (φc, θc), ψc is not involved in this conversion but is generated independently. In addition, the acceleration reference in the UAV body z-axis directly links to the needed collective control input δcol, which is not manipulated by the inner loop at all. Based on the above ideas, if Ga is the steady-state gain matrix from the inputs (δcol, φc, θc) to the UAV body-frame accelerations, then we can obtain an approximate conversion matrix Gc as its inverse. So,

$$(\,\delta_{\mathrm{col}}\ \ \phi_c\ \ \theta_c\,)^{\mathrm{T}} = G_c\, a_{b,c} = G_a^{-1} a_{b,c}. \qquad (32)$$

Note that Ga must be non-singular; otherwise, ab cannot be fully manipulated by the control inputs δcol, δlat, δlon. For the case of NUS2T-Lion,

$$G_c = \begin{bmatrix}
0 & 0 & 0.0523 \\
0 & 0.1022 & 0 \\
-0.1022 & 0 & 0
\end{bmatrix}. \qquad (33)$$
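As a concrete illustration of Eqs. (32)–(33), the conversion is just a constant matrix multiplication. In the NumPy sketch below, only Gc comes from the paper; the acceleration command fed in is a made-up example.

```python
import numpy as np

# Command conversion of Eq. (32): a commanded body-frame acceleration a_bc
# is mapped to (delta_col, phi_c, theta_c) through the constant matrix Gc.
Gc = np.array([[0.0, 0.0, 0.0523],
               [0.0, 0.1022, 0.0],
               [-0.1022, 0.0, 0.0]])

def inner_loop_commands(a_bc):
    """Return (delta_col, phi_c, theta_c) for a body-frame acceleration command."""
    delta_col, phi_c, theta_c = Gc @ np.asarray(a_bc, dtype=float)
    return delta_col, phi_c, theta_c

# Illustrative command: a_bc = [1, 0, -2] m/s^2 in the body frame (z down).
delta_col, phi_c, theta_c = inner_loop_commands([1.0, 0.0, -2.0])
```

The structure of Gc mirrors the physics: the collective responds to the body z acceleration, the roll reference to the lateral acceleration, and the pitch reference (with a sign flip) to the longitudinal acceleration.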

4. Navigation

The previous section has solved the control problem of NUS2T-Lion. However, it assumes that all the measurements needed by the control law are available and reliable, which is too ideal and rarely happens in practical situations. Although the onboard GPS/INS sensor can provide rich information, including the UAV's global position, velocity and attitude angles, there is still missing information, or an unacceptable amount of measurement noise, for the UAV to perform precise loading and unloading of cargoes on the moving platforms. As the moving platforms are actually set up to simulate cargo ships, they will be called 'ships' in the following context to avoid ambiguity with the term 'platform', which is used for both the UAV platform and the cargo platform.

In order to synchronize with the motion of the ships, the controlled UAV needs to know at every moment how the ships are moving. A simple and reliable solution is to install another GPS/INS sensor on the ship and send its information to the UAV onboard system. By doing so, the UAV can be controlled in a ship-referenced frame instead of the global frame. In this ship-referenced frame, or ship frame for simplicity, a zero steady-state position and


velocity tracking error means the UAV is controlled to stay right above the ship, moving at the same velocity as the ship.

However, it should be noted that this zero error is judged by the measurement difference of the two GPS/INS sensors. Considering their respective circular error probable, the measurement error sometimes goes beyond 6 m in all three axes. Obviously, this is not accurate enough for the cargo transportation task. In this section, the UAV z-axis measurement will first be complemented by fusing in the information from a scanning laser range finder. As the height information extracted from the laser scanner is available and reliable at all times, it can be consistently fused in to improve the navigation accuracy in the z-axis. On the other hand, the x- and y-axis measurement errors are more difficult to reduce in a navigation sense because there is no other sensor which can provide a consistent and accurate measurement in these two axes. However, since the best tracking performance is not really needed throughout the mission, but only at the loading and unloading instances, these errors can be compensated in a guidance sense once the target enters the view angle of the onboard camera. This vision-based guidance will be discussed in Sec. 5. Here, we only explain the ship-frame navigation, height calculation via the laser scanner, and height measurement fusion.

4.1. Navigation in ship frame

As the measurements provided by the GPS/INS sensors on the UAV and on the ship are defined in the same global frame, and the motion of the ship involves no rotation, it is adequate to convert all position, velocity and acceleration measurements into the ship frame by simple subtraction. So,

$$p = R_{s/g}(p_{\mathrm{uav}} - p_{\mathrm{ship}}), \qquad v = R_{s/g}(v_{\mathrm{uav}} - v_{\mathrm{ship}}), \qquad a = R_{s/g}(a_{\mathrm{uav}} - a_{\mathrm{ship}}). \qquad (34)$$

If we refer back to Fig. 18, the outer-loop measurements and references are now both represented in the ship frame instead of the NED frame. Following this convention, it is also straightforward to convert the UAV heading angle to the ship frame as well. Hence,

$$\psi = \psi_{\mathrm{uav}} - \psi_{\mathrm{ship}}. \qquad (35)$$

With this redefinition, the original rotational matrix Rb/n (rotation from the NED frame to the UAV body frame) can be substituted by Rb/s, the rotation from the ship frame to the UAV body frame. Note that φ and θ in the rotational matrix are still the UAV roll and pitch angles, as we assume that the ship has almost zero roll and pitch angles.
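Since the ship is assumed not to roll or pitch, R_{s/g} reduces to a yaw rotation by the ship heading. The NumPy sketch below applies Eq. (34) to illustrative positions (made-up numbers, NED convention with z pointing down); velocity and acceleration convert the same way.

```python
import numpy as np

# Ship-frame conversion of Eqs. (34)-(35): both GPS/INS measurements live
# in the same global NED frame, so the conversion is a subtraction followed
# by a yaw-only rotation into the ship frame.
def ship_frame(p_uav, p_ship, psi_ship):
    """Position of the UAV expressed in the ship frame."""
    c, s = np.cos(psi_ship), np.sin(psi_ship)
    R_sg = np.array([[c, s, 0.0],
                     [-s, c, 0.0],
                     [0.0, 0.0, 1.0]])  # global -> ship, yaw rotation only
    return R_sg @ (np.asarray(p_uav, dtype=float) - np.asarray(p_ship, dtype=float))

# Illustrative case: UAV hovering 3 m north of a ship heading due north.
p = ship_frame([13.0, 5.0, -8.0], [10.0, 5.0, 0.0], 0.0)
# p = [3, 0, -8]: 3 m ahead of the ship, laterally centered, 8 m above (NED z down).
```

If the ship instead heads east (ψ_ship = π/2), the same 3 m northward offset becomes a 3 m offset to the ship's port side (y = −3 in the ship frame).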

4.2. Height calculation via laser scanner

As mentioned previously, a very accurate height measurement is needed for the cargo loading and unloading tasks. In fact, it is also the most helpful information for the UAV to carry out autonomous take-off and landing. Motivated by this, a high-end scanning laser range finder is installed onboard the UAV platform. The corresponding algorithm to calculate the UAV height from its range measurements is explained below.

For each frame of scanning, the laser sensor outputs 1081 integer numbers representing the measured distances in millimeters, sequentially from its starting point on the right to the end point on the left. Each distance datum is associated with its own angular direction, so the data can be seen as being in polar coordinates. A simple transformation can be applied to the raw measurement data to convert it from polar coordinates (ri, θi) to Cartesian coordinates (xi, yi) by

$$x_i = r_i \cos\theta_i, \qquad y_i = r_i \sin\theta_i, \qquad (36)$$

where i ∈ {1, 2, 3, ..., 1081} is the index of the laser scanner measurements. Then, the split-and-merge algorithm [19] is applied to this array of 2D points so that they can be divided into clusters, with each cluster of points belonging to an individual line segment. The main steps of the split-and-merge algorithm are summarized below, with Fig. 19 giving a graphical illustration.

(a) Connect the first point A and the last point B of the input data by a straight line.

(b) Find the point C among all data points that has the largest perpendicular distance to line AB.

(c) If this largest distance is within a threshold, then a cluster is created containing the points between A and B.

(d) Otherwise, the input points are split into two subgroups, A–C and C–B, and the split-and-merge algorithm is applied recursively to each group.

Fig. 19. The split-and-merge algorithm for line extraction.
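Steps (a)–(d) can be sketched in a few lines of Python. The 0.05 m split threshold and the synthetic two-wall point set below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Split-and-merge line extraction, following steps (a)-(d) above, applied
# to the 2D points produced by Eq. (36).
def split_and_merge(points, threshold=0.05):
    """Recursively split an [N, 2] point array into single-line clusters."""
    a, b = points[0], points[-1]
    ab = b - a
    norm = np.hypot(ab[0], ab[1]) + 1e-12
    # (b) perpendicular distance of every point to the line through A and B.
    dist = np.abs(ab[0] * (points[:, 1] - a[1])
                  - ab[1] * (points[:, 0] - a[0])) / norm
    k = int(np.argmax(dist))  # point C with the largest deviation
    if dist[k] <= threshold or len(points) <= 2:
        return [points]  # (c) all points close enough: one cluster
    # (d) split at C and recurse on A..C and C..B.
    return (split_and_merge(points[:k + 1], threshold)
            + split_and_merge(points[k:], threshold))

# Two wall segments meeting at a right angle:
seg1 = np.stack([np.linspace(0, 1, 50), np.zeros(50)], axis=1)
seg2 = np.stack([np.ones(50), np.linspace(0, 1, 50)], axis=1)
clusters = split_and_merge(np.vstack([seg1, seg2]))  # two clusters expected
```

Each returned cluster can then be passed to a least-squares line fit, as described in the next paragraph of the text.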


Clusters of points are created thereafter. A least-squares line fitting algorithm can then be applied to the points in each cluster to obtain the individual lines. Each line can be represented by two parameters, namely the line's normal direction φk and its perpendicular distance dk to the center of the laser scanner. In the last sub-figure of Fig. 19, the xy axes represent the laser scanner frame. The normal direction of a line is defined as the angle from the x-axis to the line normal, counterclockwise positive. The next step is to filter out lines whose gradient is dissimilar to that of the ground plane. Since the obtained lines are still expressed in the laser scanner frame, their directions φk should be compensated by the UAV roll angle φ and then compared with the normal of the ground plane, which is at π/2. Let

$$\Delta\phi_k = \phi_k - \phi - \pi/2. \qquad (37)$$

If |Δφk| is greater than a threshold, say 5°, then the corresponding line is filtered out. The remaining lines are sorted by their perpendicular distances to the laser scanner and the furthest ones are kept. Among them, the longest line is believed to be the true ground. Finally, the perpendicular distance dk of this line to the laser scanner center is compensated with the UAV pitch angle θ and the offset Δh between the laser scanner and the UAV CG, leaving the final height estimate as

$$h = d_k \cos\theta - \Delta h. \qquad (38)$$

Figure 20 shows the flow chart of the laser-scanner-based height calculation algorithm. Using this method, an accurate height measurement can be obtained as long as the laser scanner projects a portion of its beams onto the true ground. Hence, it works even when the UAV flies over protruding objects on the ground or on the ship surface.

4.3. Height measurement fusion

Since there are two sources of height measurements, one from the GPS/INS and the other from the laser scanner, it is best to combine them so that the most reliable and accurate UAV state variables in the z-axis, i.e., $x_h = [\,z\ \ w_g\ \ a_{z,g}\ \ \Delta z\,]^{\mathrm{T}}$, can be obtained. Here, z is the UAV vertical height with respect to the ground surface, wg and az,g are the corresponding velocity and acceleration in this axis, and Δz is the offset between the GPS/INS height and its laser counterpart. This offset has to be considered because the two sensory systems have different zero references, and it also accounts for the time-varying position bias of the GPS/INS sensor. Here, we again formulate the estimator by considering the physical dynamics of a single-axis mass point system:

$$\dot{x}_h = A_h x_h + E_h w_h, \qquad y_h = C_h x_h + v_h, \qquad (39)$$

where

$$A_h = \begin{bmatrix}
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{bmatrix}, \qquad
E_h = \begin{bmatrix}
0 & 0 \\
0 & 0 \\
1 & 0 \\
0 & 1
\end{bmatrix}, \qquad
C_h = \begin{bmatrix}
1 & 0 & 0 & 0 \\
1 & 0 & 0 & 1 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0
\end{bmatrix} \qquad (40)$$

and wh, vh are Gaussian noises with covariance matrices Qh and Rh respectively. Qh and Rh can be chosen by analyzing the signal noise levels logged in a UAV hovering flight test, under the assumption that all measurements are Gaussian and independent of each other. In our implementation, they are set as

$$Q_h = \begin{bmatrix} 0.1 & 0 \\ 0 & 0.01 \end{bmatrix}^2, \qquad
R_h = \begin{bmatrix}
0.05 & 0 & 0 & 0 \\
0 & 0.8 & 0 & 0 \\
0 & 0 & 0.5 & 0 \\
0 & 0 & 0 & 0.3
\end{bmatrix}^2. \qquad (41)$$

By discretizing the system at a sampling rate of 50 Hz and applying a Kalman filter, a reliable estimate of the UAV height can be obtained. In implementing this filter, an additional technique is utilized to discard occasional outliers in the height measurement from the laser scanner. The idea is to check whether the received measurement is within a threshold multiple of the current process noise. If the discrepancy is too large, then the measurement at this particular instance is ignored. In [20], a similar technique is

[Flow chart: cluster points via split-and-merge → extract line parameters from each cluster → filter out lines with wrong angle direction → find the furthest line and keep lines close to it → find the longest line among the remaining → extract the perpendicular distance between the line and the laser scanner → compensate the distance with the UAV attitude and the vertical offset → return result.]

Fig. 20. Steps to compute height via laser scanner measurement.


introduced and is called the innovation filter. Figures 21 and 22 show the height estimation result using data collected in one of the flight tests. It can be seen that the fused result has higher quality than the original height information from the GPS/INS or the laser scanner alone. The slow drift of the GPS/INS (see Fig. 21) and a few small outliers in the laser height measurement (see Fig. 22) do not affect the fused result much. At the same time, the estimated UAV vertical velocity and acceleration are also less noisy than their respective raw measurements from the GPS/INS (see Figs. 23 and 24).
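The fusion filter described above can be sketched as follows. Ah, Eh, Ch, Qh and Rh come from Eqs. (39)–(41); the simple discretization, the 3σ innovation gate and the per-row sequential update are illustrative choices rather than the authors' exact implementation.

```python
import numpy as np

# 50 Hz height-fusion Kalman filter of Eqs. (39)-(41) with an
# innovation-gate outlier rejection on each measurement row.
dt = 1.0 / 50.0
Ah = np.array([[0., 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
Eh = np.array([[0., 0], [0, 0], [1, 0], [0, 1]])
Ch = np.array([[1., 0, 0, 0],   # laser height: z
               [1, 0, 0, 1],    # GPS/INS height: z + dz
               [0, 1, 0, 0],    # GPS/INS vertical velocity
               [0, 0, 1, 0]])   # GPS/INS vertical acceleration
F = np.eye(4) + Ah * dt + Ah @ Ah * dt**2 / 2  # exact: Ah is nilpotent
Q = Eh @ np.diag([0.1, 0.01])**2 @ Eh.T * dt   # first-order discretization
R = np.diag([0.05, 0.8, 0.5, 0.3])**2

def kf_step(x, P, y, gate=3.0):
    x, P = F @ x, F @ P @ F.T + Q              # predict
    for i in range(4):                         # sequential scalar updates
        c = Ch[i:i + 1]
        s = float(c @ P @ c.T) + R[i, i]       # innovation covariance
        innov = y[i] - float(c @ x)
        if abs(innov) > gate * np.sqrt(s):
            continue                           # innovation filter: drop outlier
        k = (P @ c.T) / s                      # Kalman gain (4x1)
        x = x + (k * innov).ravel()
        P = (np.eye(4) - k @ c) @ P
    return x, P
```

In a quick simulation of a hover with an occasional large laser outlier, the gate rejects the outlier while the remaining rows keep the height and offset estimates consistent, mirroring the behavior reported in Figs. 21 and 22.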

5. Guidance

After the control and navigation problems have been solved, the next task is to guide the UAV to perform meaningful movements by generating mission-oriented reference trajectories. There are two major problems here. One is how to use vision as a guidance sensor to locate the target positions. In this problem, the target localization result must be correct and accurate enough for the UAV to carry out precise loading and unloading actions. The other problem is how to generate a continuous trajectory linking the UAV's current status to the final destination while considering the physical limitations of the UAV dynamics. This is to make sure that

Fig. 21. Result of height estimation by data fusion (laser, GPS/INS and fused height over time).

Fig. 24. Result of vertical acceleration estimation by data fusion.

Fig. 23. Result of vertical velocity estimation by data fusion.

Fig. 22. Result of height estimation by data fusion (zoomed in).


the generated reference is smooth enough while satisfying the UAV outer-loop bandwidth requirements. In the following content, the solutions to these two problems are presented accordingly.

5.1. Vision-based target localization

Vision-based target detection and localization are usually mission dependent. In the context of the UAVGP requirements, the cargoes to be transported are plastic buckets located inside circular patterns drawn on the ship surfaces. The first idea that naturally comes to mind is to use image processing to detect circles and then perform circle-based pose estimation. However, it should be noted that after camera projection, circles become ellipses in the image. Hence, the main task of the vision sub-system here is to first select the correct target ellipse in the captured image and then estimate the 3D pose of the actual circle on the ship surface. The flow chart of the vision system is given in Fig. 25.

Before zooming into the detailed steps, three key algorithms in this vision system need to be highlighted: ellipse detection, ellipse tracking and single-circle-based pose estimation. These three algorithms are not restricted to the specific UAVGP competition tasks, but are also applicable to many other UAV guidance tasks such as circle-based target following and circle-based landing.

(a) Ellipse detection has been investigated extensively in the literature. Ellipse fitting, introduced in [21, 22], is chosen as the core ellipse detection algorithm in this work because it is very efficient compared to the Hough-transform-based ellipse detection algorithms proposed in [23, 24]. Unfortunately, ellipse fitting only fits the best ellipse to a given contour without questioning whether the contour should be regarded as an ellipse in the first place. To compensate for this shortcoming, a three-step procedure, consisting of pre-processing, ellipse fitting and post-processing, is proposed and implemented for real-time and robust ellipse detection. The pre-processing is based on affine moment invariants (AMIs) [25], while the post-processing is based on the algebraic error between the contour and the fitted ellipse. The three-step procedure is not only robust against non-elliptical contours, but can also handle partial occlusion cases.

(b) Ellipse tracking is to continuously track a single ellipse after its detection has been initialized. In practical applications, multiple ellipses may be detected in an image but only one of them is to be targeted. There are two main challenges for ellipse tracking. First, the areas enclosed by the ellipses (the one of interest and the others) are similar to each other in both shape and color. Thus, template matching based on color, shape or feature points may not be suitable for this task. Second, when implementing vision-based tracking algorithms on a flying UAV, the fast dynamical motion of the UAV may cause large displacement of the target ellipse between two consecutive images. In order to track the target ellipse smoothly, the frame rate of the image sequence must be high, which requires a very efficient implementation of the vision algorithm. To best address these problems, an efficient image tracking method, CAMShift [26], is chosen as the core of the tracking algorithm in this work.
This algorithm can robustly track the target ellipse even when the scale, shape or color of the ellipse area is dynamically changing.

(c) Single-circle-based pose estimation is to calculate the 3D position of the target circle after its projected ellipse on the 2D image has been identified and tracked. Circle-based camera calibration and pose estimation have been studied in [27–31]. However, these existing works mainly focused on the case of concentric circles [29–31], whereas our aim is to do pose estimation via only one circle. Theoretically, it is impossible to estimate the pose of a single circle purely from its perspective projection [28]. From a practical point of view, however, the single-circle-based pose estimation problem can be solved by adopting a reasonable assumption that the image plane of the camera is parallel to the plane that contains the circle. This assumption is satisfied in our work because the onboard camera is installed on a pan-tilt mechanism which keeps the image plane always parallel to the ground plane. By exploiting this assumption,

[Flow chart: image (640×480 pixels) → image pre-processing → ellipse detection → cluster the ellipses. If initialization is not done: if two ships can be detected, select a target ellipse based on the two ships, otherwise select a target ellipse arbitrarily. If initialization is done, select a target ellipse based on ellipse tracking. Finally, pose estimation from the target ellipse.]

Fig. 25. Flow chart of the vision system.

Development of an Unmanned Helicopter 79


the 3D position of the targeted circle can be estimated from its projection as a single ellipse in the image.
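Under the parallel-plane assumption, the depth of the circle follows from its known physical radius and its apparent radius in the image, and the lateral offsets follow from similar triangles. The Python sketch below illustrates this geometry; the pinhole intrinsics (f, cx, cy) and the circle radius R are illustrative assumptions, not the actual camera parameters used on NUS2T-Lion.

```python
def circle_pose(u, v, a_px, f, cx, cy, R):
    """Estimate the 3D position (camera frame) of a circle's center.

    Assumes the image plane is parallel to the circle's plane, so a
    circle of known radius R (m) projects to a circle of apparent
    radius a_px (pixels). f, cx, cy are assumed pinhole intrinsics
    (pixels), not NUS2T-Lion's actual camera parameters.
    """
    Z = f * R / a_px          # depth from apparent size
    X = (u - cx) * Z / f      # lateral offsets by similar triangles
    Y = (v - cy) * Z / f
    return X, Y, Z
```

For example, with f = 500 px and R = 0.5 m, a circle that appears 50 px in radius lies 5 m below the camera.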

More detailed explanations and discussions of these vision algorithms are documented in a separate paper due to their own research significance in the vision community [32]. We next explain the steps shown in Fig. 25.

(a) Image — Color images are captured at 5 Hz by the onboard camera. The image resolution is 640 × 480 pixels.

(b) Image pre-processing — The purpose of this block is to detect all the possible contours that may be formed by the projected circles. It consists of four sub-steps, namely image un-distortion, RGB to HSV conversion, color thresholding and contour detection. Since the color information of the circles has been given in the competition, the contours of the circles can be efficiently detected based on color thresholding in the HSV color space. It is also possible to detect the contours using other methods such as edge detection; however, they consume more computational power.
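As an illustration of the color-thresholding sub-step, the following sketch gates pixels in HSV space using only the standard library's `colorsys` module; the hue/saturation/value thresholds are made-up values, not the competition colors.

```python
import colorsys

def in_hsv_range(rgb, h_range, s_min, v_min):
    """True if an (r, g, b) pixel (0-255 ints) passes an HSV gate.

    h_range is (low, high) with hue in [0, 1); wrap-around (e.g. red
    straddling hue 0) is supported. Thresholds are illustrative.
    """
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    lo, hi = h_range
    hue_ok = lo <= h <= hi if lo <= hi else (h >= lo or h <= hi)
    return hue_ok and s >= s_min and v >= v_min

def threshold_mask(image, h_range, s_min=0.4, v_min=0.3):
    """Binary mask over a row-major list of RGB pixel tuples."""
    return [[int(in_hsv_range(px, h_range, s_min, v_min)) for px in row]
            for row in image]
```

Contour detection would then run on the binary mask; in a real implementation both steps would use an optimized library rather than per-pixel Python.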

(c) Ellipse detection — The perspective projections of the circles are ellipses. Among the contours given in the previous step, the ones that correspond to ellipses should be detected.
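The post-processing idea of the three-step detector described earlier, rejecting contours whose algebraic error against the fitted ellipse is large, can be sketched as follows. For brevity the sketch assumes an axis-aligned ellipse whose fitted parameters are already available, and the tolerance is an assumed value.

```python
def ellipse_algebraic_error(contour, xc, yc, a, b):
    """Mean algebraic residual of contour points against an axis-aligned
    ellipse with center (xc, yc) and semi-axes a, b. Rotation is
    omitted for brevity; values near 0 mean a good elliptical fit."""
    errs = [abs(((x - xc) / a) ** 2 + ((y - yc) / b) ** 2 - 1.0)
            for x, y in contour]
    return sum(errs) / len(errs)

def accept_as_ellipse(contour, xc, yc, a, b, tol=0.1):
    """Post-processing gate: keep the fit only if the residual is small.
    The tolerance is an assumed value, not the paper's."""
    return ellipse_algebraic_error(contour, xc, yc, a, b) < tol
```

A contour lying on the ellipse passes the gate, while e.g. a rectangle fitted by the same ellipse is rejected, which is exactly the failure mode of unguarded ellipse fitting.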

(d) Ellipse clustering — The main aim of ellipse clustering is to decide whether the two ships are in the camera's field of view. If they are, there should be two clusters of ellipses. Since the four ellipses on each ship are of the same size and are distributed evenly in a line, the size and position information can be used for ellipse clustering. If two clusters are obtained, we can conclude that the two ships are in the field of view.
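A minimal sketch of this clustering decision, assuming the ellipse centers of one ship sit close together along the image x-axis while the two ships are separated by a large gap (the gap threshold is an assumed value):

```python
def cluster_1d(xs, gap):
    """Group sorted values; start a new cluster whenever two consecutive
    values are farther apart than `gap` (an assumed pixel threshold)."""
    xs = sorted(xs)
    if not xs:
        return []
    clusters = [[xs[0]]]
    for x in xs[1:]:
        if x - clusters[-1][-1] > gap:
            clusters.append([x])
        else:
            clusters[-1].append(x)
    return clusters

def two_ships_visible(ellipse_centers_x, gap=100.0):
    """Both ships are deemed visible when the ellipse centers form
    exactly two well-separated clusters."""
    return len(cluster_1d(ellipse_centers_x, gap)) == 2
```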

(e) Initialization — The UAV takes off at a location far from the ships and is guided to the two ships based on GPS. Once the vision system can detect two ships, an initialization procedure will be triggered. The purpose of the initialization procedure is to select a proper target circle according to the UAV's current task. The tracking algorithm is also initialized in this step.

(f) Select a target ellipse arbitrarily — This is a failsafe mechanism. After the UAV takes off, it will be guided to the ship area by GPS. However, the position measurement from GPS is not very precise, so the UAV may not be guided to the exact position at which both ships are in the onboard camera's field of view. To handle this kind of situation, the vision system will return any detected ellipse until the initialization condition has been detected.

(g) Select a target ellipse based on two ships — If two ships are in the field of view, we select the target ellipse based on knowledge of the initial placements of the cargoes and the current task status. Suppose the buckets are initially placed on the right ship and are required to be taken to the left one. If the current mission for the UAV is to grab a bucket, the vision system will select a circle that contains a bucket on the right ship. If the current mission for the UAV is to drop a bucket, the vision system will select a circle that does not contain a bucket on the left ship.
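The selection rule above can be sketched as a small predicate; the dictionary fields (`ship`, `has_bucket`, `id`) are hypothetical bookkeeping, not the paper's actual data structures:

```python
def select_target(ellipses, task):
    """Pick a target circle when both ships are in view.

    Each ellipse is a dict with hypothetical fields: 'ship' ('left' or
    'right') and 'has_bucket' (bool). Buckets start on the right ship,
    so grabbing targets an occupied circle on the right and dropping
    targets an empty circle on the left.
    """
    if task == 'grab':
        wanted = [e for e in ellipses
                  if e['ship'] == 'right' and e['has_bucket']]
    elif task == 'drop':
        wanted = [e for e in ellipses
                  if e['ship'] == 'left' and not e['has_bucket']]
    else:
        raise ValueError('unknown task: %s' % task)
    return wanted[0] if wanted else None
```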

(h) Select a target ellipse based on ellipse tracking — In many cases, the two ships may not be in the field of view simultaneously. Then the ellipse tracking algorithm is used to track a target ellipse over the image sequence.

(i) Pose estimation from the target ellipse — Once the target ellipse is selected in any of the ways mentioned above, the ellipse is used to estimate the position of the circle center relative to the camera. The detailed method is explained in [32].

Figure 26 shows a number of consecutive images taken by the onboard camera. All the detected ellipses have been

Fig. 26. Onboard images with the ellipse detection and tracking result.



drawn on each image. The green ellipse is the target ellipse tracked by the vision algorithm. The yellow ellipse is the area of interest returned by the CAMShift algorithm. It can be seen that all the ellipses have been successfully detected. The target ellipse is also tracked steadily even when its scale and shape keep varying.

5.2. Trajectory planning

When the target location has been precisely obtained in the camera frame, say $T_c$, it needs to be transformed to the ship orientation, and the position offset between the camera and the UAV CG needs to be compensated. This leads to the ship-frame target location relative to the UAV expressed as:

$$T_s = R_{s/c} T_c - R_{s/b} \tilde{T}_b, \quad (42)$$

where $\tilde{T}_b$ is the camera's position in the UAV body frame. With the first two elements of $T_s$ known, say $t_x$ and $t_y$, the UAV x- and y-axis trajectory planning can be carried out independently. For each axis, the trajectory planning algorithm needs to know the current UAV velocity $v_0$, the maximum allowed velocity $v_{max}$, the maximum allowed acceleration $a_{max}$ and the displacement from the current position to the final position $S$. In this specific case, $S = t_x$ or $S = t_y$. Figure 27 illustrates the fundamental idea of the proposed trajectory planning algorithm, which has the following characteristics:

(a) The resulting position reference is continuous and smooth.
(b) The resulting velocity reference is continuous.
(c) The resulting acceleration reference is non-continuous and has only three discrete possibilities: $a_{max}$, $-a_{max}$ and 0.
(d) The trajectory can start from a nonzero velocity but always ends at zero velocity.
(e) The area under the velocity profile from time 0 to $T$ integrates to the total displacement.

Viewing Fig. 27 in a geometrical way, one can get the following two relationships:

$$S = S_1 + S_2 + S_3 = \tfrac{1}{2}(v_0 + v_0 + a_1 t_1)t_1 + (v_0 + a_1 t_1)(t_2 - t_1) + \tfrac{1}{2}(v_0 + a_1 t_1)(T - t_2), \quad (43)$$

$$v_1 = v_0 + a_1 t_1 = -a_2 (T - t_2). \quad (44)$$

If $v_0$, $a_1$, $a_2$ and $T$ are known, $t_1$ can be solved from a quadratic equation:

$$A_t t_1^2 + B_t t_1 + C_t = 0, \quad (45)$$

where

$$A_t = a_1^2 - a_1 a_2, \qquad B_t = 2 v_0 a_1 + 2 a_1 a_2 T, \qquad C_t = v_0^2 + 2 a_2 v_0 T - 2 a_2 S. \quad (46)$$

Define $D_t = B_t^2 - 4 A_t C_t$. Then

$$t_1 = \begin{cases} -C_t / B_t & \text{if } A_t = 0, \\[4pt] \dfrac{-B_t - \sqrt{D_t}}{2 A_t} & \text{if } A_t C_t > 0,\ A_t B_t < 0,\ D_t > 0, \\[8pt] \dfrac{-B_t + \sqrt{D_t}}{2 A_t} & \text{if } A_t C_t < 0,\ D_t > 0, \\[4pt] -1 & \text{otherwise}, \end{cases} \quad (47)$$

and correspondingly,

$$t_2 = T + \frac{v_0 + a_1 t_1}{a_2}. \quad (48)$$

However, $a_1$, $a_2$ and $T$ are not known exactly. To solve this problem, a recursive algorithm that enumerates all four $a_1$–$a_2$ sign combinations while increasing $T$ in 1-second steps is proposed in Fig. 28. The iteration stops once a feasible solution occurs.
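The quadratic solve of (45)–(47), the recovery of $t_2$ via (48), and the outer search over the four $a_1$–$a_2$ sign combinations with $T$ grown in 1-second steps can be sketched as below. The feasibility checks ($0 \le t_1 \le t_2 \le T$ and $|v_1| \le v_{max}$) are our reading of the flowchart, not a verbatim reproduction of it.

```python
import math

def solve_t1(v0, a1, a2, T, S):
    """Solve the quadratic (45)-(47) for t1; returns -1.0 if infeasible."""
    At = a1 * a1 - a1 * a2
    Bt = 2 * v0 * a1 + 2 * a1 * a2 * T
    Ct = v0 * v0 + 2 * a2 * v0 * T - 2 * a2 * S
    if At == 0:
        return -Ct / Bt if Bt != 0 else -1.0
    Dt = Bt * Bt - 4 * At * Ct          # discriminant, eq. (47)
    if Dt < 0:
        return -1.0
    if At * Ct > 0 and At * Bt < 0:
        return (-Bt - math.sqrt(Dt)) / (2 * At)
    if At * Ct < 0:
        return (-Bt + math.sqrt(Dt)) / (2 * At)
    return -1.0

def plan(v0, S, vmax, amax, T_step=1.0, T_max=60.0):
    """Enumerate the four a1-a2 sign combinations while growing T,
    in the spirit of Fig. 28. Returns (a1, a2, t1, t2, T) or None."""
    T = T_step
    while T <= T_max:
        for a1 in (amax, -amax):
            for a2 in (amax, -amax):
                t1 = solve_t1(v0, a1, a2, T, S)
                if t1 < 0:
                    continue
                t2 = T + (v0 + a1 * t1) / a2   # eq. (48)
                v1 = v0 + a1 * t1              # constant-velocity segment
                # assumed feasibility checks (our reading of the flowchart)
                if 0 <= t1 <= t2 <= T and abs(v1) <= vmax:
                    return a1, a2, t1, t2, T
        T += T_step
    return None
```

For example, starting at rest with S = 10 m, vmax = 2 m/s and amax = 1 m/s², the search yields the classic trapezoidal profile: accelerate for 2 s, cruise until t = 5 s, decelerate until T = 7 s.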

In actual implementations, this trajectory planning algorithm runs at only 1 Hz because it consumes substantial computational power on an embedded computer. In addition, instead of using the current velocity measurement as $v_0$, we use the current velocity reference, and instead of accumulating on the current position measurement for the future position reference, we accumulate on the current position reference. This makes sure that the velocity and position reference signals are always continuous. In fact, this is quite reasonable because, at a slow rate of trajectory planning with strict velocity and acceleration constraints, the UAV's actual position, velocity and acceleration should settle to more or less the same values as their corresponding reference signals by every instance the trajectory planning algorithm is called.

Fig. 27. Trajectory planning with continuous velocity.



6. Software Realization

In order to implement the aforementioned GNC algorithms and to handle the mission logics in completing the UAVGP tasks, a real-time software system was developed. The following subsections present the key features of NUS2T-Lion's software system.

6.1. Software structure

The software structure of NUS2T-Lion is illustrated in Fig. 29. It consists of three separate software programs, namely the onboard control software, the onboard vision software and the GCS software. The onboard control and vision software both utilize a multiple-thread framework, so that time resources can be precisely distributed among different functional blocks (threads). It is also a more reliable way of implementing real-time UAV software, since the malfunction of an individual thread will not halt the execution of the others.

The onboard control software is developed on a real-time operating system (RTOS), namely QNX Neutrino, which provides reliable support for high-precision timer and synchronization operations. Multiple tasks (threads), including operating hardware devices such as the navigation sensor, the Lidar and the servo control board, implementing the automatic control algorithms, logging in-flight data and communicating with the GCS and the vision computer, are

Fig. 29. Software structure of NUS2T-Lion.

[Flowchart: initialize $v_0$, $S$, $v_{max}$, $a_{max}$ and $T$; for each of the four $a_1$–$a_2$ sign combinations, solve (45) for $t_1$; if no combination yields a feasible solution, increase $T$ and repeat; END once a feasible solution is found.]

Fig. 28. Flowchart of the trajectory planning algorithm.



managed by the MAIN function. With the multiple-thread framework, it is also easy to run different threads at different frequencies. More details about this onboard control software can be found in [33].
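The idea of running threads at different frequencies off one base-rate loop can be illustrated with a simple tick scheduler; the task names and rates below are illustrative, not NUS2T-Lion's actual thread set.

```python
def tasks_due(tick, base_hz, rates):
    """Return the tasks that should run on a given tick of a base-rate
    main loop. `rates` maps task name -> frequency in Hz; each rate is
    assumed to divide base_hz evenly. Names and rates are illustrative."""
    return [name for name, hz in rates.items()
            if tick % (base_hz // hz) == 0]

# Example: a 50 Hz main loop with slower logging and GPS threads
RATES = {'imu': 50, 'control': 50, 'log': 10, 'gps': 5}
```

On tick 10 of a 50 Hz loop, for instance, the 5 Hz GPS task and the 10 Hz logging task both fire alongside the 50 Hz tasks.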

Similarly, the onboard vision software is also divided into multiple tasks, namely image capturing from the camera sensor, image processing, data logging and communication with the GCS and the control computer. The operating system on the vision computer is the very popular Ubuntu Linux. It supports abundant hardware drivers, such as USB cameras and WiFi adapters, and software libraries such as OpenCV, which are very convenient for the development of complicated vision algorithms.

The GCS software runs on a laptop with Windows 7. Such a commercial operating system provides strong support for the development of user interfaces. Visual C++ is employed to develop and debug the GCS software. By utilizing the MFC (Microsoft Foundation Class) library, the global shared data are hosted in a document class, in which a variety of functions for data access and manipulation are integrated. While this document class is the kernel of the software program, there are also the communication thread and other threads that control the multiple views of the foreground user interface. The communication thread receives and sends data to the UAV onboard system through the WiFi link, and the multiple display views periodically visit the contents of the document and update their respective displays with newly received data.

6.2. Mission logics

As the UAV cargo transportation application is mission oriented, the sequence and logics of the whole task operation are rather critical. They are implemented in the CTL thread of the onboard control software. An overview of the logic flow is illustrated in Fig. 30. There are six sequential tasks, namely taking off, navigating to the ship, vision initialization, transporting cargos, returning home and landing. Since the mission is time constrained, a timer interrupt is also implemented. It triggers the return home task once a predefined maximum mission duration runs out. The details of each task are discussed as follows.

(1) The Take Off task will be triggered once the onboard software program receives the `Mission Start Command' from the GCS. In this stage, the program lets the helicopter warm up its engine by issuing a small and constant value to the throttle channel. After a while, the throttle channel control signal will be increased gradually until the engine enters the governor mode (the main blades will now be controlled at a predefined rotational speed of 1750 rpm). After that, the program will slowly increase the control signal of the collective pitch channel so that the lift force increases.

Once the collective pitch signal hits its trimming value for the hovering condition, the program will ask the reference generation function to issue a `going up' trajectory. At the end of the trajectory, the program throws a `Take-off Event End' signal.

(2) The software program now enters the Navigate to Ship mode. In this stage, the program collects the position and velocity information from the GPS/INS system on the ship. A relative path to the ship with continuous relative position, relative velocity and relative acceleration references will be generated. The flight controller continuously asks the helicopter to track this path so that the helicopter can catch up with the ship and have the same position and velocity profiles as the ship at steady state. Once it has caught up with the ship, the software will throw a `Navigation Event End' signal. Note that this decision is made based on GPS/INS information; physically the UAV may not be hovering precisely above the center of the two ships.

(3) In the Vision Initialization stage, the vision system will first check whether it can detect two ships. If only one ship or part of a ship has been detected, the vision system will guide the helicopter to move towards one of the detected circles. In this way, there is a very high chance of seeing the other ship by taking advantage of the onboard camera's wide viewing angle. Once both ships are successfully detected, the software will be scheduled to the Transporting Cargo mode.

(4) The Transporting Cargo task is the most sophisticated part of the mission. In this stage, the UAV will choose one of the cargos and fly to a position right above it. When the UAV's horizontal position relative to the target cargo enters a small threshold, its height reference will be gradually adjusted down to an appropriate value so that the mechanical claw can grasp the bucket handle.
Once the mechanical claw is closed, the UAV will be commanded to fly upwards quickly so that the limit switch sensors mounted under the claw platform can sense whether the cargo has been successfully loaded. If it senses that the cargo has not been grasped successfully, the UAV will be commanded to go down again for another try. The above procedure will be repeated until the limit switch system detects a successful cargo loading. After that, the helicopter will be commanded to move to the unloading point. For the unloading task, the UAV has a similar procedure to check whether the cargo has been successfully released. If the detection is false, the UAV will quickly close and open its claw to try another release. For failsafe, when the vision system loses the cargo target for more than 10 s during the `grasping' stage, the software will issue a `going up' command so that the vision system can have a wider view, which leads to a higher chance of retrieving the target. Once the vision system retrieves the target, the UAV will be commanded to go down and try grasping the cargo again. There is a counter to record how many cargos



remain to be transported. Once the counter hits zero, the program will jump out of the current mode and enter the Return Home mode.

(5) When the helicopter has finished all its transportation tasks or the maximum mission time runs out, the Return Home task will be triggered. The software will generate a reference trajectory ending at a predefined height, with the UAV's final planar position set to the initial take-off point. The UAV will then follow this trajectory back to the home location.

(6) Landing will be triggered as the helicopter flies right above its home location. The procedure for the Landing task is similar to the Take Off task. The software asks the flight controller to regulate the helicopter moving downwards at a constant speed of 0.5 m/s (if the height is greater than 5 m) or 0.2 m/s (if the height is less than 5 m). Once the UAV

Fig. 30. Mission logics.

Fig. 31. NUS2T-Lion in the International UAV Innovation Grand Prix.



landing gear approaches the ground (within 8 cm), the control signal to the throttle channel will jump to a minimum value so that the engine shuts down completely.
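The descent-speed schedule and the six-task sequencing with the timer interrupt forcing Return Home can be sketched as follows; the state names and the `next_task` signature are our own simplification of the logic in Fig. 30.

```python
def descent_rate(height_m):
    """Landing descent schedule: 0.5 m/s above 5 m, 0.2 m/s below,
    engine cut once the landing gear is within 8 cm of the ground."""
    if height_m <= 0.08:
        return 0.0
    return 0.5 if height_m > 5.0 else 0.2

# Sketch of the six sequential tasks (names are our own labels)
MISSION = ['TAKE_OFF', 'NAVIGATE_TO_SHIP', 'VISION_INIT',
           'TRANSPORT_CARGO', 'RETURN_HOME', 'LANDING']

def next_task(current, cargo_left, time_left_s):
    """Advance the sequential mission logic; the timer interrupt
    forces RETURN_HOME when the mission time runs out."""
    if time_left_s <= 0 and current not in ('RETURN_HOME', 'LANDING'):
        return 'RETURN_HOME'
    if current == 'TRANSPORT_CARGO' and cargo_left > 0:
        return current                     # keep transporting
    i = MISSION.index(current)
    return MISSION[min(i + 1, len(MISSION) - 1)]
```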

7. Experimental and Competition Results

In preparation for the UAVGP competition, numerous flight tests were carried out to verify the overall solution and to tune for optimal performance. Figure 31 shows a

snapshot of NUS2T-Lion going to grab the second bucket in this competition. Figures 32–34 show the position data logged in one of the flight tests. As the raw data is obtained by GPS/INS and then converted to the ship frame, it may not be the ground truth. However, it still shows the control performance in a general sense and indicates whether the UAV is making the correct movements. In Fig. 32, the x position signal becomes larger progressively because the UAV is moving from the first bucket to the fourth bucket. It always comes back to a position around zero because the reference path is purposely defined so that the onboard camera has the best view of the two ships before every loading or unloading dive. In Fig. 33, the y position signal goes back and forth, indicating alternating movements between the two ships. In Fig. 34, it is clear to see all the diving motions of the UAV. The UAV stays at a very low altitude for a variable time duration depending on how many loading or unloading trials are performed before the final successful one.

With this kind of performance, NUS2T-Lion successfully accomplished the competition tasks in the UAVGP rotary-wing category. A final score of 1127.56, with 472.44 from the preliminary contest and 655.13 from the final, earned the team second position in the overall Grand Prix. In fact, 655.13 was the highest score in the final round of the competition. It should be highlighted that, unlike the preliminary contest, the final round of the competition required the UAV to carry out the cargo transportation task with the `ships' moving. This demands better robustness and higher intelligence from the participants' UAV systems, and it is indeed the strongest point of the GNC solution proposed in this paper. The full process has been video-recorded and

[Figure: x (m) versus time (s); curves: Measurement, Reference.]

Fig. 32. UAV position response in the ship-frame x-axis.

[Figure: z (m) versus time (s); curves: Measurement, Reference.]

Fig. 34. UAV position response in the NED-frame z-axis.

[Figure: y (m) versus time (s); curves: Measurement, Reference.]

Fig. 33. UAV position response in the ship-frame y-axis.



uploaded to [34] and [35] for the English and Chinese versions, respectively.

8. Conclusions

In conclusion, this paper has presented a comprehensive design and implementation methodology for solving the rotorcraft UAV cargo transportation problem. Beyond the hardware innovation, the main contribution of this paper lies in the UAV guidance, navigation and control algorithms for solving this practical problem. Different from the conventional GPS/INS-based UAV navigation scheme, this solution also integrates a sophisticated vision system capable of locating the cargo loading and unloading positions. By setting up a second GPS/INS unit on the moving platform with communication to the UAV onboard system, the controlled UAV is able to follow the dynamic platform with good performance. The GNC algorithms are also backed by a multi-layer, multi-thread software system, and have thus been implemented successfully. With this set of technologies, the application of UAV cargo transportation or goods delivery is more than realizable.

References

[1] A. Bujak, M. Smolarek and A. Gębczyńska, Applying military telematic solutions for logistics purposes, in 11th Int. Conf. Transport Systems Telematics (2011), pp. 248–256.

[2] M. Bernard and K. Kondak, Generic slung load transportation system using small size helicopters, in IEEE Int. Conf. Robotics and Automation (2009), pp. 3258–3264.

[3] M. Bisgaard, A. la Cour-Harbo and J. D. Bendtsen, Adaptive control system for autonomous helicopter slung load operations, Control Eng. Pract. 18(7) (2010) 800–811.

[4] M. Orsag, C. Korpela and P. Oh, Modeling and control of MM-UAV: Mobile manipulating unmanned aerial vehicle, J. Intell. Robotic Syst. 69(1–4) (2013) 227–240.

[5] J. Thomas, J. Polin, K. Sreenath and V. Kumar, Avian-inspired grasping for quadrotor micro UAVs, in IDETC/CIE, ASME (2013).

[6] I. Maza, K. Kondak, M. Bernard and A. Ollero, Multi-UAV cooperation and control for load transportation and deployment, J. Intell. Robotic Syst. 57(1–4) (2010) 417–449.

[7] K. Kosuge and M. Sato, Transportation of a single object by multiple decentralized-controlled nonholonomic mobile robots, in IEEE Int. Conf. Intelligent Robots and Systems (1999), pp. 1681–1686.

[8] M. Jankovic and B. K. Ghosh, Visually guided ranging from observations of points, lines, and curves via an identifier based nonlinear observer, Syst. Control Lett. 25(1) (1995) 63–73.

[9] S. Saripalli, J. F. Montgomery and G. Sukhatme, Vision-based autonomous landing of an unmanned aerial vehicle, in IEEE Int. Conf. Robotics and Automation (2002).

[10] F. Lin, X. Dong, B. M. Chen, K. Y. Lum and T. H. Lee, A robust real-time embedded vision system on an unmanned rotorcraft for ground target following, IEEE Trans. Ind. Electron. 59 (2012) 1038–1049.

[11] V. N. Dobrokhodov, I. I. Kaminer, K. D. Jones and R. Ghabcheloo, Vision-based tracking and motion estimation for moving targets using small UAVs, in American Control Conference (2006).

[12] P. Vela, A. Betser, J. Malcolm and A. Tannenbaum, Vision-based range regulation of a leader–follower formation, IEEE Trans. Control Syst. Technol. 17(2) (2009) 442–448.

[13] H. Lang, M. T. Khan, K.-K. Tan and C. W. de Silva, Developments in visual servoing for mobile manipulation, Unmanned Syst. 1(1) (2013) 143–162.

[14] G. Cai, B. M. Chen and T. H. Lee, Unmanned Rotorcraft Systems (Springer, New York, 2011).

[15] F. Lin, K. Z. Y. Ang, F. Wang, B. M. Chen, T. H. Lee, B. Yang, M. Dong, X. Dong, J. Cui, S. K. Phang, B. Wang, D. Luo, K. Peng, G. Cai, S. Zhao, M. Yin and K. Li, Development of an unmanned coaxial rotorcraft for the DARPA UAVForge Challenge, Unmanned Syst. 1(2) (2013) 211–258.

[16] B. M. Chen, Robust and H∞ Control, Communications and Control Engineering Series (Springer, 2000).

[17] B. M. Chen, T. H. Lee and V. Venkataramanan, Hard Disk Drive Servo Systems, Advances in Industrial Control Series (Springer, New York, 2002).

[18] D. Mellinger and V. Kumar, Minimum snap trajectory generation and control for quadrotors, in IEEE Int. Conf. Robotics and Automation (2011), pp. 2520–2525.

[19] G. A. Borges and M. J. Aldon, A split-and-merge segmentation algorithm for line extraction in 2D range images, in 15th Int. Conf. Pattern Recognition (2000), pp. 441–444.

[20] S. Zhao, F. Lin, K. Peng, B. M. Chen and T. H. Lee, Homography-based vision-aided inertial navigation of UAVs in unknown environments, in Proc. 2012 AIAA Guidance, Navigation, and Control Conference (2012).

[21] A. Fitzgibbon, M. Pilu and R. B. Fisher, Direct least square fitting of ellipses, IEEE Trans. Pattern Anal. Mach. Intell. 21 (1999) 476–480.

[22] S. J. Ahn, W. Rauh and H.-J. Warnecke, Least-squares orthogonal distances fitting of circle, sphere, ellipse, hyperbola, and parabola, Pattern Recogn. 34 (2001) 2283–2303.

[23] D. H. Ballard, Generalizing the Hough transform to detect arbitrary shapes, Pattern Recogn. 13(2) (1981) 111–122.

[24] R. A. McLaughlin, Randomized Hough transform: Improved ellipse detection with comparison, Pattern Recogn. Lett. 19 (1998) 299–305.

[25] J. Flusser and T. Suk, Pattern recognition by affine moment invariants, Pattern Recogn. 26 (1993) 167–174.

[26] J. G. Allen, R. Y. D. Xu and J. S. Jin, Object tracking using CamShift algorithm and multiple quantized feature spaces, in Proc. Pan-Sydney Area Workshop on Visual Information Processing, Darlinghurst, Australia (2004), pp. 3–7.

[27] J. Heikkila and O. Silven, A four-step camera calibration procedure with implicit image correction, in Proc. 1997 IEEE Conf. Computer Vision and Pattern Recognition, San Juan, Puerto Rico (1997), pp. 1106–1112.

[28] G. Wang, J. Wu and Z. Ji, Single view based pose estimation from circle or parallel lines, Pattern Recogn. Lett. 29 (2008) 977–985.

[29] J.-S. Kim, P. Gurdjos and I.-S. Kweon, Geometric and algebraic constraints of projected concentric circles and their applications to camera calibration, IEEE Trans. Pattern Anal. Mach. Intell. 27(4) (2005) 637–642.

[30] G. Jiang and L. Quan, Detection of concentric circles for camera calibration, in Proc. 10th IEEE Int. Conf. Computer Vision, Beijing, China (2005), pp. 333–340.

[31] D. Eberli, D. Scaramuzza, S. Weiss and R. Siegwart, Vision based position control for MAVs using one single circular landmark, J. Intell. Robotic Syst. 61 (2011) 495–512.



[32] S. Zhao, Z. Hu, M. Yin, K. Ang, P. Liu, F. Wang, X. Dong, F. Lin, B. Chen and T. Lee, A robust real-time vision system for autonomous cargo transfer by an unmanned helicopter, IEEE Trans. Ind. Electron. PP(99) (2014).

[33] M. Dong, B. M. Chen, G. Cai and K. Peng, Development of a real-time onboard and ground station software system for UAV helicopter, AIAA J. Aerospace Comp. Info. Comm. 4 (2007) 933–955.

[34] Unmanned Aircraft Systems Group, National University of Singapore, "The video of the final round competition of UAVGP (English version)," http://www.youtube.com/watch?v=Ex0jgtX2ZrY (2013).

[35] Unmanned Aircraft Systems Group, National University of Singapore, "The video of the final round competition of UAVGP (Chinese version)," http://v.youku.com/v_show/id_XNjI5MTM2MDI0.html (2013).

