This may be the author's version of a work that was submitted/accepted for publication in the following source:

Mcfadyen, Aaron & Mejias Alvarez, Luis (2015) Design and evaluation of decision and control strategies for autonomous vision-based see and avoid systems. In Sabatini, R, Theilliol, D, & Saripalli, S (Eds.) Proceedings of the 2015 International Conference on Unmanned Aircraft Systems, ICUAS. IEEE, United States of America, pp. 607-616.

This file was downloaded from: https://eprints.qut.edu.au/85097/

© Consult author(s) regarding copyright matters

This work is covered by copyright. Unless the document is being made available under a Creative Commons Licence, you must assume that re-use is limited to personal use and that permission from the copyright owner must be obtained for all other uses. If the document is available under a Creative Commons Licence (or other specified licence) then refer to the Licence for details of permitted re-use. It is a condition of access that users recognise and abide by the legal requirements associated with these rights. If you believe that this work infringes copyright please provide details by email to [email protected]

Notice: Please note that this document may not be the Version of Record (i.e. published version) of the work. Author manuscript versions (as submitted for peer review or as accepted for publication after peer review) can be identified by an absence of publisher branding and/or typeset appearance. If there is any doubt, please refer to the published source.

https://doi.org/10.1109/ICUAS.2015.7152342



Design and Evaluation of Decision and Control Strategies for Autonomous Vision-based See and Avoid Systems

Aaron Mcfadyen1 and Luis Mejias2

Abstract— This paper details the design and performance assessment of a unique collision avoidance decision and control strategy for autonomous vision-based See and Avoid systems. The general approach revolves around re-positioning a collision object in the image using image-based visual servoing, without estimating range or time to collision. The decision strategy thus involves determining where to move the collision object, to induce a safe avoidance maneuver, and when to cease the avoidance behaviour. These tasks are accomplished by exploiting human navigation models, spiral motion properties, expected image feature uncertainty and the rules of the air. The result is a simple threshold-based system that can be tuned and statistically evaluated by extending performance assessment techniques derived for alerting systems. Our results demonstrate how autonomous vision-only See and Avoid systems may be designed under realistic problem constraints, and then evaluated in a manner consistent with aviation expectations.

I. INTRODUCTION

Unmanned Aircraft Systems (UAS) represent an important future technology, with the ability to augment or replace manned aircraft for a range of commercial applications [1]. As many industry sectors rapidly embrace the technology and further diversify the application base, there is an increased demand to allow unmanned aircraft regular access to unrestricted civilian airspace. Integrating unmanned aircraft into such a complex and structured environment is not trivial, and creates a set of challenging technical, regulatory and social issues that remain unresolved [2].

The most restrictive, and arguably most important, issue is the lack of automated See and Avoid systems for unmanned aircraft [3]. This capability refers to the reactive short-term collision avoidance used by pilots in response to unplanned hazards such as terrain, conflicting aircraft or weather. It is an uncooperative approach primarily relying on the pilots' visual system, human collision avoidance behaviour, recollection of regulatory procedures and experience [4], [5]. Automating such a unique collision avoidance process, and demonstrating an equivalent performance level to that of manned aircraft, presents a set of difficult problems [6]. Challenges remain regarding the design of detection, decision and control algorithms, as well as the performance evaluation techniques used to assess their utility and safety.

Regarding system design, a natural choice for object detection and tracking is the use of passive electro-optic devices [7]. They offer a lightweight, low-cost sensing solution for a range of unmanned aircraft regardless of size, weight and power limitations. A major drawback however is the

1,2 The authors are with the Science & Engineering Faculty, Queensland University of Technology, Brisbane, Australia. [email protected]

Fig. 1. Example image taken from an unmanned aircraft in a near-miss encounter with a Cessna 172R, where r ≈ 1500 m and tcpa ≈ 15 s.

limited amount of state information that can be reliably estimated. As distant aircraft appear as small, low-contrast, slow moving point features in the image until seconds before collision (see Fig. 1), only relative angular measurements can be reliably obtained [8], [9]. Estimating relative position and velocity is considerably more complex for a number of reasons. The limited avoidance time available, unknown object motion (and size) and large relative geometries (scale) create significant observability issues that can cause difficulties for techniques leveraging passive ranging [10], visual looming [11], [12] or stereo vision [13]. Without relative position and velocity, determining safe avoidance maneuvers is then significantly challenging. The decision process is further complicated when considering that predictability is important for other airspace users. Automating the process then requires balancing expectations on sensor limitations with aviation procedures [14]. To this end, the following contributions are proposed in this paper:

- A detailed analysis of a collision avoidance and resolution decision strategy that explicitly considers a realistic operational environment, visual sensing limitations and existing aviation procedures (rules of the air).

- Introduction of a novel system tuning and performance evaluation technique for autonomous vision-based See and Avoid decision strategies.

- First known attempt at statistically evaluating decision strategies explicitly designed for automated vision-based See and Avoid systems.

This paper is structured as follows. Section II provides a literature review of collision avoidance decision strategies and evaluation techniques in the context of See and Avoid. Section III describes the proposed collision avoidance strategy with emphasis on the decision aspects. The statistical performance evaluation method is then presented in Section IV, and used to assess the proposed decision strategy. Conclusions and further work are offered in Section V.


II. BACKGROUND

A. Decision Strategies

An automated collision avoidance decision strategy refers to the way in which the available state information is used to make choices aimed at resolving conflict. It involves deciding how to avoid the object (avoidance decision), implementing any prescribed action, and then deciding when to cease the avoidance behaviour (resolution decision). For vision-based See and Avoid systems, only relative state information obtained from imaging sensors can be used for each process.

If accurate relative position and velocity estimates could be acquired from image sequences, a large number of existing decision strategies used in robotics [15] and aviation could be considered [16]. These include approaches leveraging conflict probability (risk) estimates [17], dynamic programming and Markov decision processes (MDP) [18], path planning [19], potential fields [20], and geometric optimisation such as collision cones and velocity obstacles [21]. Many of these can guarantee collision avoidance in the nominal case, with some approaches facilitating the simultaneous optimisation of multiple objectives such as miss distance, course deviation [22] and observability [23]. Although appealing, applying such strategies requires some rather optimistic or incorrect assumptions regarding the object behaviour and attributes, operational environment or sensor technology.

Assuming that only visual cues such as image position, velocity, shape and size can be reliably estimated from image sequences, then designing decision strategies is more restrictive. One approach is to formulate the problem in an image-based visual control framework, allowing the direct use of visual cues for decision and control. Such approaches better resemble See and Avoid behaviour, with many designs often inspired by insect and human navigation models.

Insects express a variety of different navigation behaviours. During the day, insects avoid collisions by assessing the optic flow field patterns created by nearby objects. The concept has been successfully applied to unmanned aircraft for large static collision objects [24], [25]. Decisions are qualitative in the sense that a scaled direction command is issued, without a specific aiming (reference) image point [26]. During the night, insects navigate by fixating the moon's light rays at a constant position in the eye, which causes them to inadvertently avoid or collide with point light sources via a spiral trajectory [27]. This concept has been used implicitly [28], [29] and explicitly [30], [31] for avoidance of variable sized static objects. Dynamic objects were then considered for a subset of encounter types [32], [33]. Decisions are quantitative in the sense that a specific aiming (reference) image point is selected.

Humans tend to rely on relative angular observations when making avoidance decisions. Collisions are identified by a zero angular rate or a fixed angular position [34]. Collision avoidance is then achieved by adopting an anticipatory or predictive avoidance approach that forces a non-zero angular rate [35]. Decisions are again qualitative (up, right, speed up etc.) but can be influenced by prior knowledge [36]. For example, the decision to pass in front or below may be guided by experience [37], [38], or specific right-of-way rules such as those used at sea or in aviation [39]. Directly using angular rate to discriminate between See and Avoid encounters in a binary classification scheme has been investigated with varied results [18], [40]. Although effective at avoiding collisions, many non-collision encounters were incorrectly classified, leading to unnecessary avoidance action [18]. This is likely due to the uncertainty on the visual observations. One way to remodel this approach is to assume an action will always be taken, whereby the specific action depends on the certainty or confidence in the visual observations. This means precautionary avoidance actions may then be adopted, which has also been observed in human See and Avoid behaviour [36], [37].

B. System Performance Evaluation

Performance evaluation for automated collision avoidance solutions generally refers to the process used to measure the utility and safety of the system. In the context of vision-based See and Avoid systems, this involves assessing the detection, decision and control system components individually and collectively. To ensure strict safety standards are maintained, performance evaluation methods should attempt to align with existing aviation processes. This means each system component may require rigorous testing in simulation and the intended operating environment, in an attempt to provide a comprehensive statistical performance analysis. Given the relative immaturity and diversity of proposed vision-based See and Avoid systems, a unified performance evaluation framework does not yet exist.

For detection systems, it is common to use principles from signal detection theory such as Receiver Operating Curves [41] to simultaneously visualise trade-offs in performance and design parameter selection. This allows system parameters, such as threshold placement, to be tuned in simulation and refined empirically via analysis of false alarm and correct detection statistics [42]. Other metrics such as the initial detection distance can be included, but are generally less useful when considering detection consistency [43].
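The threshold-tuning idea above can be sketched by sweeping a detection threshold over scored encounters and recording the false alarm and correct detection rates that form the points of a Receiver Operating Curve. The Gaussian score model below is purely an illustrative assumption, not the detector of [41] or [42].

```python
import numpy as np

def roc_points(scores_pos, scores_neg, thresholds):
    """For each threshold, return (false alarm rate, correct detection rate):
    the fraction of non-target and target scores at or above the threshold."""
    pts = []
    for t in thresholds:
        fpr = float(np.mean(scores_neg >= t))  # false alarms
        tpr = float(np.mean(scores_pos >= t))  # correct detections
        pts.append((fpr, tpr))
    return pts

# Hypothetical detector scores: genuine aircraft tracks score higher on average.
rng = np.random.default_rng(0)
scores_pos = rng.normal(2.0, 1.0, 1000)   # collision-course targets
scores_neg = rng.normal(0.0, 1.0, 1000)   # clutter / noise tracks
pts = roc_points(scores_pos, scores_neg, np.linspace(-3.0, 5.0, 17))
```

Each point visualises one candidate threshold placement; raising the threshold trades correct detections for fewer false alarms, which is the trade-off the curve makes explicit.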

For decision and control, performance is often assessed through stability guarantees, practical demonstrations (often on scaled-down platforms) or in simulation studies. Metrics such as miss distance are commonly used, and trials are often restricted to a subset of encounters or under nominal operating conditions. Although useful, a statistical performance evaluation approach would offer a more complete assessment. As most decision strategies require parameter tuning, it would then make sense to use similar performance evaluation techniques to those used for detection. An attempt was made for alerting systems and then used to successfully evaluate existing aircraft collision avoidance systems (TCAS) [44]. Unfortunately, the approach cannot be applied directly to completely autonomous systems, vision-based or otherwise, so would require adaptation. If modified, such a framework would be useful for analysing and designing decision strategies solely based on assessing image feature behaviour. The approach may also help to evaluate human navigation models and pilot See and Avoid behaviour. In this work, we combine the above findings and propose:

- A collision avoidance decision strategy based on assessing visual cues to re-position the object on the image surface using image-based visual control. Reference image positions are selected using the properties of spiral motion, aviation right-of-way rules and the expected uncertainty on image feature estimates.

- An initial investigation into a statistical performance evaluation approach for vision-based See and Avoid systems, by extending existing techniques for alerting systems to fully autonomous systems. The approach is then used to optimise the proposed decision strategy, visualise performance and identify difficult collision encounter types.

III. COLLISION AVOIDANCE & RESOLUTION

A. Avoidance Maneuvers

Conical spiral trajectories are used as the basis for collision avoidance maneuvers. They describe the set of trajectories that circumscribe the surface of a cone, and are parametrised by a fixed velocity v and constant elevation β ∈ (0, π) and bearing α ∈ (−π, π] angles to the apex [27]. Depending on the reference conical angles c∗(β∗, α∗), divergent or convergent spirals (planar, ascending or descending) can be followed, with circular motion as a special case. Previous work has shown conical spirals to be a useful avoidance maneuver for static objects, provided

|α∗| ≥ π/2, β∗ ≠ π/2    (1)

The result is circular or divergent motion about the apex. In the notion of reducing course deviation, circular motion is then preferred [30], [31]. Collision avoidance of constant velocity objects is also possible by attempting to establish and track a conical spiral such that α∗ = ±π/2 and β∗ ≠ π/2 [33]. Even if the reference spiral cannot be achieved for faster objects, attempting to establish and maintain it can result in safe avoidance behaviour. As evidence, consider an object moving with constant velocity vt and constant heading in the xy-plane. An aircraft with velocity v = 1 m/s then attempts to follow a reference spiral such that c∗(π/2, −π/2). Consider two cases in which the object moves with vt > v and vt < v. For each case, the object adopts an initial relative heading α0 ∈ {0, π/4, π/2, 3π/4, π, 5π/4, 3π/2, 7π/4} and the aircraft is initially re-positioned to ensure α = −π/2. The resulting aircraft trajectories are shown with respect to the object frame in Fig. 2.

For vt < v the aircraft tracks the reference conical angles, resulting in a distorted spiral trajectory. For π/2 ≤ α0 ≤ 3π/2 (black, grey), the aircraft initially moves away from the object, following a safe avoidance maneuver. It is not until the aircraft attempts to overtake and pass in front of the object that a potential collision may occur. For α0 < π/2 or α0 > 3π/2 (red) the aircraft initially moves toward the object, such that an unsafe avoidance maneuver is adopted.

For vt > v the aircraft cannot track the reference conical angles for all object headings, and no spiral trajectory is

[Fig. 2 plots: X (m) vs Y (m); panels (a) vt < v = 0.5 m/s and (b) vt > v = 1.5 m/s]

Fig. 2. Example spiral trajectories displayed in the object frame Fo for a reference spiral c∗(π/2, −π/2). The initial object position in the world frame Fw is (1,1) and the aircraft is displaced by 1 m. The object position (+) and the safe and unsafe aircraft trajectories are shown.

adopted. However, for π/2 ≤ α0 ≤ 3π/2, attempting to track the reference spiral initially moves the aircraft away from the object, as the correct avoidance direction is adopted. Again, for α0 < π/2 or α0 > 3π/2, the aircraft initially moves toward the object by adopting an unsafe avoidance direction. Importantly, the faster object has greater influence on the encounter geometry than the slower object.
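The geometric claim above, that |α∗| ≥ π/2 yields circular or divergent motion about a static object, can be checked numerically with a minimal constant-bearing simulation. This is a sketch, not the paper's controller: the point-mass kinematics, initial geometry and step sizes are assumptions.

```python
import math

def simulate_constant_bearing(obj, alpha_ref, v=1.0, dt=0.005, steps=2000):
    """Move an aircraft so a static object is held at constant bearing
    alpha_ref relative to the heading; return the range history."""
    px, py = 0.0, -1.0                 # start 1 m from the object
    ranges = []
    for _ in range(steps):
        theta = math.atan2(obj[1] - py, obj[0] - px)  # line-of-sight angle
        psi = theta - alpha_ref                       # heading that holds the bearing
        px += v * dt * math.cos(psi)
        py += v * dt * math.sin(psi)
        ranges.append(math.hypot(obj[0] - px, obj[1] - py))
    return ranges

# |alpha*| = pi/2: motion perpendicular to the line of sight -> circular orbit.
circle = simulate_constant_bearing((0.0, 0.0), -math.pi / 2)
# |alpha*| > pi/2: radial velocity component v*cos(alpha*) < 0 -> divergence.
diverge = simulate_constant_bearing((0.0, 0.0), 3 * math.pi / 4)
```

The radial closing speed is v cos(α∗), so the bearing condition of (1) directly controls whether the range holds (circular) or grows (divergent).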

The above analysis demonstrates spiral trajectories as a viable collision avoidance trajectory, with the difficulty residing in the determination of the reference spiral orientation (α∗ = ±π/2). With unknown heading, it is impossible to analytically determine the optimal spiral direction to adopt. However, an avoidance decision is still required to determine which exact spiral to track for an arbitrary encounter. This ambiguity is addressed in the following section.

Remarks: Simply moving left or right can result in an appropriate avoidance action, but it does not provide the same desirable qualities offered by attempting to follow a spiral path. First, the conical angles can be identified from a single point feature in the image, and tracked using various image-based visual control schemes. Second, the curved spiral trajectories force the aircraft back toward the original heading after avoidance, minimising unnecessary action.


[Fig. 3 diagram: planar (σ, γ) representation of the spherical image surface, showing the hazard space D, the safe spaces, and the uncertainty in the current image feature measurements]

Fig. 3. Example planar representation of the spherical imaging surface, including an example object (•), bounded uncertainty (· · ·) on the image feature position ξs and danger area D.

B. Avoidance Decision

Consider a point image feature position s(σ, γ) and image feature rate ṡ(σ̇, γ̇), where σ and γ denote de-rotated colatitude and azimuth angles measured from the image centre. Consider now a new metric s̄ derived from the image observations that can be used to represent the relative feature convergence, divergence or lack thereof such that

s̄ = di(s) ṡᵀ    (2)

where di(·) denotes a diagonal matrix. A strongly diverging object moving away from the image centre will mean that s̄ ≫ 0. A strongly converging object moving toward the image centre will mean that s̄ ≪ 0. A relatively stationary object, that may exhibit some movement, means that s̄ ≈ 0.

Consider also some uncertainty vectors ξs and ξṡ for the image feature observations, whose elements consist of the variances¹ on the image feature angular positions and rates such that ξs = (ξσ ξγ) and ξṡ = (ξσ̇ ξγ̇). The variables can be used to denote uncertainty ellipses about the image features with respect to position and velocity such that

s di(ξs) sᵀ = 0,    ṡ di(ξṡ) ṡᵀ = 0    (3)

In a similar fashion as above, consider an uncertainty threshold η = (ησ ηγ)ᵀ such that

η = di(ξs) ξṡᵀ    (4)

Now consider comparing the feature behaviour against a positive avoidance threshold at the initial confirmed detection instant td according to

s̄(td) ≤ ηᵀ,    η ≥ 0    (5)

The comparison evaluates the object's image behaviour whilst considering the expected measurement uncertainty. It helps qualitatively distinguish between the actual object behaviour and that induced by noise. To explain, consider the nominal case with perfect sensing and setting η = 0. Evaluating if s̄(td) = ηᵀ determines if a stationary object is in the centre of the image, or a dynamic object is indeed stationary in the image. These are the conditions known to lead to collision. For imperfect sensing, setting η > 0 and assessing if s̄(td) ≤ ηᵀ suggests the object is either relatively stationary in the image with arbitrary motion bounded by ηᵀ, or very close to the image centre. In this case, the object may be considered a more significant collision threat than if s̄(td) > ηᵀ. If the threshold is large such that η ≫ 0, then almost all stationary, converging or diverging objects would be considered a major collision threat using the same assessment.

¹Although unconventional, the symbol ξ is used instead of σ² to denote variance to avoid confusion with the colatitude angle σ.

If the avoidance threshold is then used to denote the confidence (or variance) in the visual observations based on expected uncertainty, it represents a single parameter that can be tuned based on the degree of conservativeness that is desired. If large, the implication is that the camera is perhaps of lower quality or the ambient conditions are causing difficulties in object detection and tracking. If small, the opposite might be implied. Alternatively, the threshold value may be set based on the camera's measured performance during calibration, or updated during flight. The aforementioned variables are depicted in Fig. 3.
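Under an elementwise reading of Eqs. (2), (4) and (5), with di(·) a diagonal matrix, the avoidance trigger can be sketched as below. The variance values are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

def collision_metric(s, s_dot):
    """Eq. (2): elementwise product of feature position and rate.
    Large positive = diverging from the image centre; large negative =
    converging; near zero = relatively stationary (potential collision)."""
    return np.diag(s) @ np.asarray(s_dot)

def avoidance_trigger(s, s_dot, xi_s, xi_sdot):
    """Eqs. (4)-(5): flag the object as a significant threat when the metric
    is bounded (elementwise) by the uncertainty-derived threshold eta."""
    eta = np.diag(xi_s) @ np.asarray(xi_sdot)
    return bool(np.all(collision_metric(s, s_dot) <= eta))

# Illustrative uncertainty: small position variance, larger rate variance.
xi_s, xi_sdot = np.array([0.01, 0.01]), np.array([0.1, 0.1])
stationary = avoidance_trigger(np.array([0.2, 0.1]), np.array([0.0, 0.0]), xi_s, xi_sdot)
diverging = avoidance_trigger(np.array([0.2, 0.1]), np.array([0.5, 0.5]), xi_s, xi_sdot)
```

Enlarging xi_s or xi_sdot enlarges eta and so makes the trigger more conservative, which matches the tuning interpretation given above.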

Once the object motion has been qualitatively assessed using the avoidance threshold, an appropriate reference image feature position s∗(σ∗, γ∗) is then selected. Provided that the image features are de-rotated in pitch and roll, the camera angles approximate the conical angles such that

s∗(σ∗, γ∗) ≈ c∗(β∗, α∗) (6)

and so implicitly define a reference conical spiral [31]. Recall, determining the polarity of the reference image features (spiral direction) is impossible without knowledge of the object heading and velocity. Aviation flight rules, and in particular the right-of-way rules [39], [33], are therefore used as a convenient baseline solution to help resolve the ambiguity. The resulting logic for lateral and vertical avoidance is given by Algorithms 1 and 2 (Appendix I). To help describe the algorithms, the avoidance decisions for some example cases shown in Fig. 4 and Fig. 5 are described below.

- Example 1: For σ1 > ησ, the object is diverging in the top half plane. It is likely a non-crossing, non-collision object (either static or dynamic). The object is then allowed to pass above the aircraft by setting σ∗ = π/2 + σ0. For σ2 > ησ, a similar situation occurs, but the object is now converging. The object is then allowed to cross in front of the aircraft by setting σ∗ = π/2 − σ0.

- Example 2: For σ3 ≤ ησ and σ4 ≤ ησ, the objects are relatively stationary in the top half plane. They are likely collision objects (dynamic), but no right-of-way rules exist for the vertical dimension in non-overtaking encounters. The objects are forced to pass in front by setting σ∗ = π/2 ± σ0 accordingly.

- Example 3: For γ1 and γ2, the object appears behind the aircraft and is not a collision object regardless of its behaviour. This is because it is no longer the primary responsibility of the aircraft. For γ3 > ηγ, the object is diverging in the right half plane. It is likely a non-crossing, non-collision object (either static or dynamic). The object is then allowed to pass to the right of the aircraft by setting γ∗ = π/2. For γ9 > ηγ the object is converging in the left half plane. It is likely a crossing non-collision object (dynamic). For γ10 > ηγ, the object is diverging in the left half plane. It is likely a non-crossing, non-collision object (either static or dynamic). In both cases however, applying the right-of-way rules means the object is required to pass behind the aircraft by setting γ∗ = −π/2.

[Fig. 4 diagrams: planar (σ, γ) image surface with hazard space D and safe spaces, showing the feature positions for (a) Example 1 (σ1, σ2, σ∗) and (b) Example 2 (σ3, σ4, σ∗)]

Fig. 4. Example collision avoidance cases. The image feature position (•) is shown, along with σ (−→) and the resulting avoidance decision (or logic) outcome with respect to the colatitude reference σ∗.

- Example 4: For γ4 ≤ ηγ and γ5 ≤ ηγ, the object is relatively stationary in the right half plane. It is likely a crossing collision object (dynamic), and right-of-way must be given. The object is then allowed to pass in front by setting γ∗ = −π/2. For γ6 ≤ ηγ, the object may be static and directly in front of the aircraft, or dynamic and just prior to collision. This constitutes a near-head-on encounter, so the aircraft must turn right. As such, the object is allowed to pass in front by setting γ∗ = −π/2. For γ7 ≤ ηγ and γ8 ≤ ηγ, the object is relatively stationary in the left half plane. It is likely a crossing collision object, and right-of-way must be given. In this case however, the aircraft has right-of-way, which means the object is required to pass behind. As such, the object is forced to pass behind by setting γ∗ = −π/2.

Remarks: Action is always taken, with decoupled lateral and vertical avoidance decisions resulting in a mixture of the above cases. Decisions are intended to be more predictable from a pilot's perspective instead of guaranteeing collision avoidance or geometric optimality. Additionally, if the object employs the same avoidance strategy, the resulting motion is complementary. For example, if the object observes the aircraft on the right, the object gives way and the aircraft has right-of-way such that γ∗ = −π/2 for both platforms.
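A hypothetical reconstruction of the lateral (azimuth) branch of this logic, pieced together from Examples 3 and 4 only: the paper's actual Algorithm 1 sits in its Appendix I, outside this excerpt, so the function name, the product-form metric and the case ordering here are all assumptions.

```python
import math

def lateral_reference(gamma, gamma_dot, eta_gamma):
    """Select a reference azimuth gamma* from the object's azimuth gamma,
    azimuth rate gamma_dot and the uncertainty threshold eta_gamma."""
    if abs(gamma) > math.pi / 2:
        return None                  # object behind: no longer our responsibility
    metric = gamma * gamma_dot       # divergence indicator on the azimuth axis
    if metric > eta_gamma and gamma > 0:
        return math.pi / 2           # diverging in the right half plane: pass right
    return -math.pi / 2              # all other cases: right-of-way sends the
                                     # object behind or in front (Examples 3-4)
```

For instance, lateral_reference(0.3, 0.5, 0.01) reproduces Example 3's diverging right-half case (γ∗ = π/2), while a relatively stationary near-head-on object (γ ≈ 0, γ̇ ≈ 0) reproduces Example 4's turn-right rule (γ∗ = −π/2).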

[Fig. 5 diagrams: planar (σ, γ) image surface with hazard space D and safe spaces, showing the feature positions for (a) Example 3 (γ1, γ2, γ3, γ9, γ10, γ∗) and (b) Example 4 (γ4–γ8, γ∗)]

Fig. 5. Example collision avoidance cases. The image feature position (•) is shown, along with γ (−→) and the resulting avoidance decision (or logic) outcome with respect to the reference azimuth angle γ∗.

C. Avoidance Control

Spherical image-based visual servoing is used to track the reference image features. As only a single point is observed, only two degrees of freedom can be directly controlled from the visual feedback. To remain applicable to all platforms (rotary and fixed wing), only vertical velocity (or position) and yaw rate (or roll angle) are visually controlled. The remaining degrees of freedom can be controlled separately using a series of PID and LQRI controllers to maintain a fixed forward speed. The result is a partitioned control approach in which the image-based control can be cast in a classical or predictive (optimal) control framework.

Using classical methods, the control vector uz is found by assuming an exponential decrease in the image feature error e such that

uz(t) = Lz⁻¹ (−λ e(t) − Lxy vxy(t))    (7)

where Lz and Lxy represent the z and xy axis components of the approximate spherical image Jacobian respectively. The vxy term denotes the translational and rotational velocities about the x and y axes, and λ is a constant gain term. The image Jacobian is approximate in the sense that it is parametrised by a fixed reference range value. This is not restrictive, as stable, conservative avoidance trajectories can be obtained by overestimating its value and assuming a moderate gain term [30].
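A numerical sketch of the classical law (7). The 2×6 Jacobian values and the column ordering (xy degrees of freedom first, the two visually controlled z degrees of freedom last) are assumptions for illustration, not the paper's Jacobian structure (see [32] for that).

```python
import numpy as np

def classical_ibvs(e, L, lam, v_xy):
    """Eq. (7): u_z = L_z^{-1} (-lambda * e - L_xy * v_xy).
    e: 2-vector image feature error; L: 2x6 approximate spherical image
    Jacobian; v_xy: the four translational/rotational x,y axis velocities."""
    L_xy, L_z = L[:, :4], L[:, 4:]   # assumed column split: xy DOFs | z DOFs
    return np.linalg.solve(L_z, -lam * e - L_xy @ v_xy)

# The law is built so the closed-loop feature error decays exponentially:
# L_z u + L_xy v_xy = -lambda * e, i.e. e_dot = -lambda * e.
rng = np.random.default_rng(1)
L = rng.normal(size=(2, 6))          # stand-in Jacobian for the check below
e, v_xy = rng.normal(size=2), rng.normal(size=4)
u = classical_ibvs(e, L, 0.5, v_xy)
```

The compensation term −Lxy vxy removes the feature motion induced by the separately controlled degrees of freedom, leaving the visually controlled channels to drive the error to zero.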

Using predictive methods, an optimal control sequence u∗z(·) is found by minimising a cost function Js subject to control, platform and sensing (visibility) constraints such that

u∗z(·) = argminU Js(z(t), uz(·))    (8)


where U defines the control constraint domain, and z denotes a mixed state consisting of both image features and vehicle states. The optimisation problem is solved over a finite time horizon using standard nonlinear model predictive control approaches. The approach offers greater flexibility than classical methods, allowing the explicit consideration of platform dynamics and associated constraints, whilst remaining aligned to anticipatory human navigation models. Stability-based designs for spiral motion also exist [31].

For both control approaches, an integral term can be added to the control at each iteration to help account for model mismatch and added uncertainty. For dynamic objects, this helps compensate for the generally unknown object motion. Using predictive control, the augmented control u⋆z is then

u⋆z(t) = u∗z(t) + λi ∫w e(t) dt    (9)

where λi is a fixed integral gain term and w denotes an arbitrary time window. These parameters should be selected in conjunction with the control constraint domain to ensure stable control. As control is not the focus of this work, we leverage past designs using the predictive approach outlined above, but include integral action (9). A detailed outline of the controller, including all parameters, dynamic models and the image Jacobian structure, can be found in [32].
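In discrete time, the windowed integral of Eq. (9) amounts to accumulating the feature error over a trailing buffer; a minimal sketch, where the fixed-length buffer realisation of the window w and the example gains are assumptions:

```python
from collections import deque

class IntegralAugmentation:
    """Maintain a trailing window of feature errors and add the integral
    term of Eq. (9) to the predictive control output u*_z."""
    def __init__(self, lam_i, window_len, dt):
        self.lam_i, self.dt = lam_i, dt
        self.errors = deque(maxlen=window_len)  # window w as a fixed-length buffer

    def augment(self, u_star, e):
        self.errors.append(e)
        # u?_z = u*_z + lam_i * sum(e) * dt  (rectangular integration)
        return u_star + self.lam_i * sum(self.errors) * self.dt

# Example: a constant error of 0.5 over a 10-sample window at dt = 0.1
# with lam_i = 0.1 adds 0.1 * (0.5 * 10) * 0.1 = 0.05 to the control.
ia = IntegralAugmentation(lam_i=0.1, window_len=10, dt=0.1)
out = 0.0
for _ in range(10):
    out = ia.augment(1.0, 0.5)
```

Bounding the window length keeps the integral contribution finite, which is one way to respect the paper's note that these parameters must be chosen alongside the control constraint domain.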

D. Resolution Decision

A range-independent heading-based criterion is used to indicate an appropriate time to stop the avoidance. As the platform avoids the object using a spiral path, it inadvertently attempts to return to its original heading upon initial avoidance action ψ(td). Ceasing spiral motion when ψ(t) = ψ(td) then stops the avoidance behaviour before the point of minimum separation, with the aircraft displaced from its original path and re-tracking its initial heading. The amount of time before the point of minimum separation depends on the maneuvering required to establish the spiral. This depends on the closing velocity, or difference between the reference azimuth and relative heading.

The larger the closing velocity, the less the aircraft maneuvers to establish the spiral and then return to its initial heading. This is because the object motion now reinforces the aircraft's attempt to establish the spiral. Once established, the aircraft must turn quickly to maintain the spiral, decreasing the time spent tracking the spiral before re-establishing the initial heading. So for a given relative velocity, the larger the difference between the reference azimuth and relative heading, the closer the stopping time is to the point of minimum separation. The effect is seen in Fig. 6(a) for vt < v, where the closing velocity increases as the relative heading approaches a head-on encounter (ψ(td) = 3π/2). The result is further amplified in Fig. 6(b), where vt > v and the stopping instance tends toward the time of minimum separation for all ψ(td).

An effective resolution decision strategy can then be designed based solely on monitoring the platform heading. The simplest approach is to stop the avoidance when ψ(t) = ψ(td),

[Plots omitted: panels titled "Example Spiral Resolution" with axes X (m) vs Y (m); (a) vt < v, (b) vt > v.]

Fig. 6. Example resolution cases displayed in the object frame Fo for a reference spiral s∗ = c∗(π/2, −π/2). The object initial position in the world frame is (1,1) and the aircraft is initially displaced by 1 m. The aircraft position and corresponding resolution instance are given for ψ(td) ∈ {3π/2, 5π/3, 11π/6} respectively.

but this assumes perfect state information and so will fail in the presence of noise. If |ψ(td + δt)| < |ψ(td)| due to turbulence or poor sensor quality, avoidance will be stopped prematurely, resulting in a potentially unsafe situation [30]. A better approach is to wait until the reference image features are tracked, and then place a small positive threshold ε about the reference heading such that avoidance is stopped when

ψ(td)− ε < ψ(t) < ψ(td) + ε (10)
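A minimal sketch of the threshold test (10), assuming heading angles in radians and adding a wrap-around guard against the 2π discontinuity that the text does not discuss explicitly:

```python
import math

def resolution_reached(psi, psi_d, eps):
    """Check stopping condition (10): psi lies within +/- eps of the
    heading at the avoidance decision, psi(t_d). The wrapped
    difference guards against the 2*pi discontinuity.
    Illustrative helper, not taken from the paper."""
    diff = math.atan2(math.sin(psi - psi_d), math.cos(psi - psi_d))
    return abs(diff) < eps
```

The wrapped difference matters near ψ = ±π, where a naive comparison of raw angle values would never satisfy the band even though the headings coincide.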

Although coupled to the visual control, there is no way to measure the degree to which each competing objective is accomplished. To do this, a cost function Jψ external to the visual control could be used to provide an explicit trade-off between obtaining the reference spiral and stopping the avoidance. Thresholding the minimum value J∗ψ could be considered a proxy for continuously assessing how well both objectives are satisfied. The resulting control u⋆⋆z can be expressed as,

u⋆⋆z(t) = { uz(t),   ∀ J∗ψ,   t < td ∨ ts ≤ t < ∞
          { u⋆z(t),  J∗ψ > ε,  td ≤ t < ts            (11)


[Diagram omitted: conflict and safe regions annotated with the outcome labels IC, CAIR, CACR, PACR and PAIR.]

Fig. 7. Automated collision avoidance outcomes

where uz denotes non-visual control. This is the approach taken in this paper, with details of the cost function used for resolution decisions provided in [33].
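The switching rule (11) amounts to a simple guard around the avoidance interval. A hedged sketch, with function and argument names that are illustrative rather than from the paper:

```python
def select_control(t, J_psi_min, u_z, u_z_star, t_d, t_s, eps):
    """Switching logic of (11). Non-visual control u_z applies before
    the avoidance decision (t < t_d) and once avoidance is resolved
    (t >= t_s); the augmented visual control u_z_star applies while
    J*_psi still exceeds the resolution threshold eps."""
    if t_d <= t < t_s and J_psi_min > eps:
        return u_z_star
    return u_z
```

In practice the resolution instant ts would itself be declared the first time J∗ψ falls below ε inside the avoidance interval, closing the loop between the threshold and the switch.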

IV. SYSTEM EVALUATION

The evaluation technique extends existing probabilistic approaches used in alerting systems and signal detection theory. The framework offers a means to simultaneously assess system performance and optimise the parameters and thresholds (η, ε) used in the decision and control strategy. The framework requires the identification of an augmented set of collision outcomes, and the development of associated statistical performance metrics that can be visualised using derivatives of receiver or system operating curves.

A. Collision Avoidance Outcomes

The specific outcome of a potential collision encounter is a function of the relative geometry, minimum safety margins and the complex interactions between collision avoidance system parameters. It is then intractable to enumerate each unique outcome, so the resulting collision state is categorised according to a discrete set of outcome types.

A simple binary classification consisting of collision or miss outcomes does not provide sufficient granularity to account for the inclusion of the reactive, fully automated collision avoidance system. The outcome categories need to consider that action is always taken, albeit differing based on the avoidance decision, and that the action needs to be stopped. Just because an avoidance action was taken does not ensure that the avoidance behaviour was stopped and the collision was resolved. The encounter outcome categories are therefore extended to include

PACR : Precautionary Avoidance & Correct Resolution
PAIR : Precautionary Avoidance & Incorrect Resolution
CACR : Correct Avoidance & Correct Resolution
CAIR : Correct Avoidance & Incorrect Resolution
MA : Missed Avoidance
IC : Induced Collision

where precautionary action is associated with non-collisionencounters. PACR results when precautionary action is taken

[Plot omitted: "Modified System Operating Curves (MSOC)" with axes P(−) = P(MA) + P(IC) and P(+) = P(CACR) + P(CAIR) + P(PACR); the achievable operating region is bounded by the line of no benefit, with operating points (ε1, η1), (ε2, η1) and (ε2, η2) marked.]

Fig. 8. Example Modified System Operating Curve

and then successfully stopped. PAIR results when precautionary action is taken but is incorrectly (or unsuccessfully) stopped. CACR results when collision avoidance action is required and taken, and then successfully resolved. CAIR results when collision avoidance action is required and taken, but then incorrectly resolved. MA results when a collision was present, but not avoided. IC results for non-collision encounters in which action induced a collision. For MA and IC, resolution outcomes do not make sense. The collision avoidance outcome categories are depicted in Fig. 7.

B. Modified System Operating Curves

A modified system operating curve is constructed using the observed counts of each outcome type for a large number of encounter scenarios under different system parameters. The plot consists of two axes, one denoting the probability of a desirable outcome P(+) and the other the probability of an undesirable outcome P(−), where

P(+) = P(CACR) + P(CAIR) + P(PACR)    (12)
P(−) = P(MA) + P(IC)                  (13)

Each individual probability or statistical performance metric is calculated over the total number of collision encounters N and non-collision encounters M. For example,

P(CACR) = #CACR / N,   P(PAIR) = #PAIR / M,   ...    (14)

Of note, P(CAIR) is considered desirable, as the primary concern is to avoid collision. The undesirable outcome P(PAIR) is excluded, but can be included if required.

For a given set of controller parameters, decision thresholds (η, ε) and noise characteristics, the above metrics (12)-(13) are determined and plotted as a single operating point. The system parameters are then altered and the metrics re-evaluated to obtain a set of points, resulting in a curve. The goal is to move toward the fictitious ideal operating point in the upper left corner. At this point, no collisions occur and avoidance action is always stopped for both collision and non-collision encounters.
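The operating-point computation in (12)-(14) can be sketched as follows. The normalisation convention (collision outcomes over N, precautionary and induced outcomes over M) is a literal reading of (14), and the helper is hypothetical rather than the authors' code:

```python
def msoc_point(counts, N, M):
    """Operating point (P(+), P(-)) from outcome counts, per (12)-(14).
    Collision outcomes (CACR, CAIR, MA) are normalised by the N
    collision encounters; precautionary and induced outcomes (PACR,
    PAIR, IC) by the M non-collision encounters. Assumed convention."""
    over_n = {"CACR", "CAIR", "MA"}
    p = {k: v / (N if k in over_n else M) for k, v in counts.items()}
    p_plus = p.get("CACR", 0) + p.get("CAIR", 0) + p.get("PACR", 0)
    p_minus = p.get("MA", 0) + p.get("IC", 0)
    return p_plus, p_minus
```

Sweeping a threshold, calling `msoc_point` for each tally, and plotting the resulting pairs produces one curve of the kind shown in Fig. 8.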

To isolate the decision strategy, the controller and noiseparameters are fixed and only the avoidance and resolution


thresholds are varied. The effect of each decision threshold on system performance can then be visualised and subsequently optimised accordingly, as the thresholds are mutually exclusive and decoupled. An example of such a modified system operating curve is shown in Fig. 8.

C. Simulation Architecture

The avoidance decision, avoidance control and resolution decision strategies are combined in a simulation architecture developed in MATLAB, using external optimisation routines [45]. A small quadrotor platform is assumed, whereby a simplified point-mass dynamic model is used for the predictive controller and a realistic, empirically derived dynamic model [32] is implemented during simulation. A linearised object motion model is used, with the object appearing only as a point image feature. The controller frequency fc is faster than the image processing rate fi, with objects detected within the range r0. Additive uncertainty includes sensor (image feature) noise qs(t), imperfect actuation qc(t), turbulence wg(t) and ambient wind wa(t) (Appendix II).

The avoidance threshold was initially set such that η = (ξσ, ξγ), and then varied using a scaling factor λη ∈ {1/8, 1/4, 1/2, 1, 2, 4, 8}. Avoidance thresholds significantly below and above the expected image feature uncertainty were thus evaluated. The resolution threshold was set such that ε ∈ {0.02, 0.05, 0.07, 0.09, 0.15, 0.30}. The values are coupled to the weighting matrix scales used in the cost function Jψ, and represent liberal (ε ≈ 0.30) and conservative (ε ≈ 0) values. For each η and ε, 1000 constant velocity collision and 1000 non-collision encounters were simulated in a scaled-down See and Avoid environment (1:100). Given the control constraints, nominal collision boundary rc = 0.25 m and initial geometry, no encounters were considered whereby avoidance would be infeasible for any controller.

D. Results

Fixing the avoidance threshold (λη = 1), the resolution performance was first evaluated. System performance was acceptable, and was significantly influenced by the resolution threshold value. The operating points move toward the ideal operating point as the threshold increases, before diverging when the threshold is increased such that ε > 0.15. The result articulates the trade-off between maintaining the reference spiral and ceasing avoidance behaviour. For small thresholds, resolution is conservative, as it becomes difficult to meet the threshold value (Jψ > ε, ∀t). The platform is reluctant to leave the spiral path, further encircling the object and providing the opportunity to induce future collisions (P(+) ≈ 70%, P(−) ≈ 2%). As the threshold is increased such that ε > 0.15, resolution becomes very liberal, as it becomes easier to meet the threshold value. Avoidance action may be stopped at an earlier instance (Jψ < ε, t ≈ td), so the aircraft may not have maneuvered enough to avoid a collision. The number of correct resolutions increases at the expense of increased collision risk, moving the curve rightwards, away from the ideal operating point (P(+) ≈ 90%, P(−) ≈ 5%).

[Plot omitted: "Modified System Operating Curve" with axes P(−) vs P(+).]

Fig. 9. Example modified system operating curve for variable ε ∈ (0.02, 0.30) and λη ∈ (1/8, 8) values. The set of avoidance thresholds for ε = 0.15 is approximated using the region Ωη. Three curves are shown for a collision boundary rc of 0.125, 0.25 and 0.5.

Fixing the resolution threshold (ε = 0.15), the avoidance performance was then evaluated. System performance was surprisingly good (P(+) ≈ 87-91%, P(−) ≈ 2%), and less influenced by the avoidance threshold value. The operating point locations lack a distinguishable pattern with respect to changes in η, residing in a small region about the nominal point η for λη = 1. For example, λη = 4 shows improved performance, but λη = 8 shows degraded performance compared to λη = 2. An operating region Ωη may then be coarsely approximated by finding the maximum 2-norm distance dη between the nominal operating point and η ∈ (η/8, 8η). A circular region of radius dη can then be used to approximate the region.
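The region approximation can be sketched directly; this is an illustrative helper, with operating points taken as (P(−), P(+)) pairs:

```python
import math

def operating_region_radius(nominal, swept_points):
    """Radius d_eta of the circular region Omega_eta: the maximum
    2-norm distance between the nominal operating point and the
    points obtained while sweeping the avoidance threshold eta."""
    return max(math.dist(nominal, p) for p in swept_points)
```

The circle centred on the nominal point with this radius then bounds the scatter of operating points over the threshold sweep.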

The above result can be interpreted in two ways. Either there are insufficient simulations to draw adequate conclusions, or there exist encounter geometries that cause problems for the avoidance approach regardless of the threshold value. The latter may be due to the reliance on right-of-way rules, or to difficulties in distinguishing collision and non-collision objects under image feature uncertainty. By always acting, however, performance surpasses that of similar systems using angular measurements to avoid only collision objects [10]. In such systems, both positive and negative outcomes were high (P(+) ≈ P(−) ≈ 90%).

Combining the above outcomes, the resulting modified system operating curves for variable collision boundaries are shown in Fig. 9. Selecting a smaller collision boundary improves performance, but the results are open to interpretation subject to aircraft size. For example, the results suggest that for rc = 0.125 the system can be 95% effective at avoiding collisions between two small aircraft with radius ≤ 0.0625 m with zero separation. Conversely, for rc = 0.5 the system can be only 85% effective at avoiding collisions involving these same aircraft, but with non-zero separation instead.

As a final result, the performance evaluation strategy can help identify the collision geometries which cause the decision strategy to fail. This is done by re-simulating the object trajectories in the aircraft reference frame for all collision


[Plot omitted: "Avoidance Trajectory Feasibility" with axes x (m) vs y (m).]

Fig. 10. Example illustration of all collision trajectories for an encounter set of 2000. Object initial positions and trajectories for missed avoidance and induced collisions are shown up to the time of collision. The aircraft initial position, without avoidance trajectory, is also shown.

encounters (missed avoidance and induced collisions), up until the time of minimum separation. This is depicted in Fig. 10 for ε = 0.15, λη = 1 and rc = 0.125. The results show that the decision strategy handles head-on encounters very well, with only a single induced collision and no missed avoidances. Crossing collision encounters are more difficult to manage, with object trajectories originating near ±90 degrees causing the most difficulty. Similar results are also obtained for different decision thresholds and collision boundaries, which provides some valuable insight into the effectiveness of the right-of-way rules themselves.

Remarks: The results omit the probability of a collision encounter itself. If included, system performance would appear to improve given the rarity of the event. Instead, the probability of an encounter is set to unity, with collisions and non-collisions equally likely, to avoid misleading the results.

V. CONCLUSIONS

Design and certification of See and Avoid systems is a challenging task that remains in the developmental stage. The approach presented in this paper demonstrates how an effective vision-based decision and control strategy for automated See and Avoid systems can be designed, tuned and statistically evaluated in a generalised framework. Initial results suggest the proposed strategy is over 90% effective at avoiding collisions, with only a 2-5% chance of a negative outcome. Additionally, the approach could be used to augment existing collision avoidance approaches.

This work constitutes a particularly unique contribution toward the progression of automated See and Avoid systems, and provides a good foundation from which to stem further research and development. This may include augmenting the performance evaluation approach to account for detection performance metrics and the likelihood of specific encounter types (collision, non-collision, near miss etc.).

ACKNOWLEDGEMENTS

This work was supported by the Institute for Future Environments (IFE) and the Australian Research Centre for Aerospace Automation (ARCAA) at Queensland University of Technology, Brisbane, Australia.

APPENDIX I

Algorithm 1 Avoidance Decision Strategy - Azimuth

if γ ∈ D then                          ▷ Danger Area
    if γ < ηγ then                     ▷ Convergent, Centreline or Static Features
        γ∗ = −π/2                      ▷ ∗∗Allow Crossing (Right-of-Way Rules, CAR 162(1-4))
    else                               ▷ Divergent Features
        if γ < 0 then                  ▷ Left Centreline
            γ∗ = −π/2                  ▷ ∗∗No Crossing
        else                           ▷ Right Centreline
            γ∗ = π/2                   ▷ ∗∗No Crossing
        end if
    end if
else                                   ▷ Outside Danger Area
    γ∗ = γ                             ▷ ∗∗No Movement
end if
∗∗ Aircraft Action
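The azimuth decision above can be sketched in code. The flattened listing does not distinguish the feature azimuth from its rate, so the sketch assumes the outer test compares the feature rate against ηγ (matching the convergent/divergent comments) while the inner test uses the azimuth sign; both are assumptions:

```python
import math

def reference_azimuth(gamma, gamma_rate, eta_gamma, in_danger_area):
    """Sketch of the azimuth avoidance decision (Algorithm 1).
    gamma: feature azimuth; gamma_rate: its assumed image-plane rate;
    eta_gamma: avoidance threshold. Names are illustrative."""
    if not in_danger_area:
        return gamma                     # outside danger area: no movement
    if gamma_rate < eta_gamma:
        # Convergent, centreline or static features: allow crossing,
        # consistent with the right-of-way rules (CAR 162(1-4))
        return -math.pi / 2
    # Divergent features: no crossing, reference set by centreline side
    return -math.pi / 2 if gamma < 0 else math.pi / 2
```

The returned value plays the role of the reference azimuth γ∗ fed to the visual controller.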

Algorithm 2 Avoidance Decision Strategy - Colatitude

Set σ = σ − π/2†, σ∗o = 35π/180‡
if Object Above Horizontal then
    if σ > ησ then                     ▷ Divergent Features
        σ∗ = π/2 + σ∗o                 ▷ ∗∗No Crossing
    else if σ < −ησ then               ▷ Convergent Features
        σ∗ = π/2 − σ∗o                 ▷ ∗∗Allow Crossing
    else                               ▷ Static or Centerline Features
        if Overtaking then
            σ∗ = σ                     ▷ ∗∗No Movement, CAR 162(4)
        else                           ▷ Not Overtaking
            σ∗ = π/2 − σ∗o             ▷ ∗∗Allow Crossing
        end if
    end if
else                                   ▷ Below Horizontal
    Reciprocal Logic
end if
∗∗ Aircraft Action

‡ σ∗o is located in a relatively nonlinear region of the image surface to improve control performance regarding stability and feasibility [31].

APPENDIX II
Simulation & Control Parameters

Parameter   Value               Units
fc          40                  Hz
fi          10                  Hz
ξs          (0.04, 0.04)        rad
ξs          (0.04, 0.04)        rad/s
tcpa        [5 55]              s
r0          [0 40]              m
vt          [−0.5 0.5]          m/s
wg(t)       N(03, I3)           rad/s
wa(t)       N(03, 0.25 I3)      m/s
qc(t)       N(02, 0.02 I2)      rad
qs(t)       N(02, ξs I2)        N, rad/s
v           0.1                 m/s
λi          0.01                -
w           10/f                s


REFERENCES

[1] K. Valavanis, "Unmanned aircraft systems: the current state-of-the-art," Springer, 2013
[2] K. Dalamagkidis, K. Valavanis, and L. Piegl, "On unmanned aircraft systems issues, challenges and operational restrictions preventing integration into the national airspace system," Progress in Aerospace Sciences, vol. 44, no. 7-8, pp. 503-519, Oct.-Nov. 2008
[3] P. Angelov, "Sense and avoid in UAS: research and application," John Wiley & Sons, 2012
[4] Australian Transport Safety Bureau (ATSB), "Limitations of the see-and-avoid principle," Tech. Report, Canberra, Australia, Nov. 2004
[5] C. Morris, "Midair collisions: limitations of the see-and-avoid concept in civil aviation," Aviation, Space, and Environmental Medicine, vol. 76, no. 4, pp. 357-365, April 2005
[6] X. Prats, L. Delgado, J. Ramirez, P. Royo and E. Pastor, "Requirements, issues, and challenges for sense and avoid in unmanned aircraft systems," Journal of Aircraft, vol. 49, no. 3, pp. 677-687, May 2012
[7] J. Lai, J. Ford, L. Mejias and P. O'Shea, "Characterization of sky-region morphological-temporal airborne collision detection," Journal of Field Robotics, vol. 30, no. 2, pp. 171-193, March-April 2013
[8] J. Lai, J. Ford, L. Mejias, and P. O'Shea, "Vision-based estimation of airborne target pseudobearing rate using hidden Markov model filters," IEEE Trans. Aerospace and Electronic Systems, vol. 49, no. 4, pp. 2129-2145, Oct. 2013
[9] G. Fasano, D. Accardo, A. Tirri and A. Moccia, "Morphological filtering and target tracking for vision-based UAS sense and avoid," IEEE Int. Conf. Unmanned Aircraft Systems (ICUAS'14), pp. 430-440, May 2014
[10] O. Shakernia, M. Chen, and V. Raska, "Passive ranging for UAV sense and avoid applications," AIAA Infotech@Aerospace Conf. and Exhibit, pp. 1-10, March 2005
[11] T. Molloy, J. Ford and L. Mejias, "Looming aircraft threats: shape-based passive ranging of aircraft from monocular vision," Australian Conf. Robotics and Automation, Dec. 2014
[12] H. Choi, Y. Kim and I. Hwang, "Reactive collision avoidance of unmanned aerial vehicles using a single vision sensor," Journal of Guidance, Control and Dynamics, vol. 36, no. 4, pp. 1234-1240, 2013
[13] S. Hrabar, "An evaluation of stereo and laser-based range sensing for rotorcraft unmanned aerial vehicle obstacle avoidance," Journal of Field Robotics, vol. 29, no. 2, pp. 215-239, 2012
[14] A. Mcfadyen and L. Mejias, "A survey of autonomous vision-based see and avoid for unmanned aircraft systems," Progress in Aerospace Sciences, 2015 (submitted)
[15] M. Hoy, A. Matveev and A. Savkin, "Algorithms for collision-free navigation of mobile robots in complex cluttered environments: a survey," Robotica, pp. 1-35, March 2014
[16] J. Kuchar and L. Yang, "A review of conflict detection and resolution modeling methods," IEEE Trans. Intelligent Transportation Systems, vol. 1, no. 4, pp. 179-189, Dec. 2000
[17] B. Vanek, T. Peni, P. Bauer, and J. Bokor, "Vision only sense and avoid: a probabilistic approach," American Control Conference (ACC'14), pp. 1204-1209, 2014
[18] M. Kochenderfer, J. Griffith, and J. Kuchar, "Hazard alerting using line-of-sight rate," AIAA Guidance, Navigation and Control Conference and Exhibit, 18-21 August, 2008
[19] J. Cobano, R. Conde, D. Alejo and A. Ollero, "Path planning based on genetic algorithms and the Monte Carlo method to avoid aerial vehicle collisions under uncertainties," IEEE Int. Conf. Robotics and Automation (ICRA'11), pp. 4429-4434, May 2011
[20] J. Ruchti, R. Senkbeil, J. Carroll, J. Dickinson, J. Holt and S. Biaz, "Unmanned aerial system collision avoidance using artificial potential fields," Journal of Aerospace Information Systems, vol. 11, no. 3, pp. 140-144, 2014
[21] A. Alexopoulos, A. Kandil, P. Orzechowski and E. Badreddin, "A comparative study of collision avoidance techniques for unmanned aerial vehicles," IEEE Int. Conf. Systems, Man, and Cybernetics (SMC'13), pp. 1969-1974, Oct. 2013
[22] P. Zapotezny-Anderson and J. Ford, "Optimal-stopping control for airborne collision avoidance and return-to-course flight," Australian Control Conference (AUCC'11), pp. 155-160, Nov. 2011
[23] H. Yu, R. Sharma, R. Beard and C. Taylor, "Observability-based local path planning and obstacle avoidance using bearing-only measurements," Robotics and Autonomous Systems, vol. 61, no. 12, pp. 1392-1405, Dec. 2013
[24] W. Green and P. Oh, "Optic-flow-based collision avoidance," IEEE Robotics & Automation Magazine, vol. 15, no. 1, pp. 96-103, March 2008
[25] C. Bills, J. Chen and A. Saxena, "Autonomous MAV flight in indoor environments using single image perspective cues," IEEE Int. Conf. Robotics and Automation (ICRA'11), pp. 5776-5783, May 2011
[26] A. Beyeler, J. Zufferey and D. Floreano, "Vision-based control of near-obstacle flight," Autonomous Robots, vol. 27, no. 3, pp. 201-219, 2009
[27] K. Boyadzhiev, "Spirals and conchospirals in the flight of insects," The College Mathematics Journal, vol. 30, no. 1, pp. 22-31, Jan. 1999
[28] J. Saunders, R. Beard, and J. Byrne, "Vision-based reactive multiple obstacle avoidance for micro air vehicles," American Control Conference (ACC'09), pp. 5253-5258, June 2009
[29] X. Yang, L. Mejias, and T. Bruggemann, "A 3D collision avoidance strategy for UAVs in a non-cooperative environment," Journal of Intelligent & Robotic Systems, vol. 70, no. 1-4, pp. 315-327, April 2013
[30] A. Mcfadyen, P. Corke and L. Mejias, "Rotorcraft collision avoidance using spherical image-based visual servoing and single point features," IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS'12), pp. 1199-1205, Oct. 2012
[31] A. Mcfadyen, P. Corke and L. Mejias, "Visual predictive control of spiral motion," IEEE Trans. Robotics, vol. 30, no. 6, pp. 1441-1454, 2014
[32] A. Mcfadyen, L. Mejias, P. Corke and C. Pradalier, "Aircraft collision avoidance using spherical visual predictive control and single point features," IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS'13), pp. 50-56, Nov. 2013
[33] A. Mcfadyen, A. Durand-Petiteville and L. Mejias, "Decision strategies for automated visual collision avoidance," Proc. Int. Conf. Unmanned Aircraft Systems (ICUAS'14), pp. 715-725, May 2014
[34] B. Fajen and W. Warren, "Behavioural dynamics of intercepting a moving target," Experimental Brain Research, vol. 180, no. 2, pp. 303-319, June 2007
[35] G. Diaz, F. Phillips and B. Fajen, "Intercepting moving targets: a little foresight helps a lot," Experimental Brain Research, vol. 195, no. 3, pp. 345-360, May 2009
[36] I. Koglbauer, R. Braunstingl and T. Haberkorn, "Modeling human and animal collision avoidance strategies," Int. Symp. Aviation Psychology, pp. 554-559, 2013
[37] T. Haberkorn, I. Koglbauer, R. Braunstingl and B. Prehofer, "Requirements for future collision avoidance systems in visual flight: a human-centered approach," IEEE Trans. Human-Machine Systems, vol. 43, no. 6, pp. 583-594, Oct. 2013
[38] C. Chauvin and S. Lardjane, "Decision making and strategies in an interaction situation: collision avoidance at sea," Transportation Research Part F: Traffic Psychology and Behaviour, vol. 11, no. 4, pp. 259-269, 2008
[39] Civil Aviation Safety Authority, "CASR Part 91, General operating and flight rules," Civil Aviation
[40] P. Angelov, C. Bocaniala, C. Xydeas, C. Pattchett, D. Ansell, M. Everett, and G. Leng, "A passive approach to autonomous collision detection and avoidance in uninhabited aerial systems," IEEE Int. Conf. Computer Modelling and Simulation, pp. 64-69, April 2008
[41] C. Metz, "Basic principles of ROC analysis," Seminars in Nuclear Medicine, vol. 8, no. 4, pp. 283-298, Oct. 1978
[42] T. Gandhi, M. Yang, R. Kasturi, O. Camps, L. Coraor and J. McCandless, "Performance characterization of the dynamic programming obstacle detection algorithm," IEEE Trans. Image Processing, vol. 15, no. 5, pp. 1202-1214, 2006
[43] J. Lai, J. Ford, L. Mejias, A. Wainwright, P. O'Shea and R. Walker, "Field-of-view, detection range, and false alarm trade-offs in vision-based aircraft detection," Int. Congress on Aeronautical Sciences (ICAS'12), Sep. 2012
[44] J. Kuchar, "Methodology for alerting-system performance evaluation," AIAA Journal of Guidance, Control, and Dynamics, vol. 19, no. 2, pp. 438-444, March-April 1996
[45] B. Houska, H. Ferreau and M. Diehl, "ACADO Toolkit - an open source framework for automatic control and dynamic optimization," Optimal Control Applications and Methods, vol. 32, no. 3, pp. 298-312, May-June 2011