
BYU UAV Team Technical Design Paper for AUVSI Student UAS Competition 2017

Brigham Young University

Abstract
The BYU UAV Team has designed and built a complete UAS specifically engineered to complete all mission tasks in the 2017 AUVSI Student UAS Competition. The team consists of volunteer students from the departments of Mechanical, Electrical, and Computer Engineering. This technical design paper describes the overall design process and the final design solutions reached by the development team. The ROSplane autopilot, developed by the BYU MAGICC Lab and enhanced by the BYU UAV Team, is one of many innovative systems resulting from this process. The team conducted extensive testing throughout development to ensure all design decisions yielded a safe system capable of fulfilling mission requirements. Descriptions of past and ongoing tests are highlighted in this paper, including individual component tests and full mission trials. Safety procedures and risk mitigation methods are also detailed.


Table of Contents

Abstract
1. Systems Engineering
   1.1 Mission Requirement Analysis
   1.2 Design Rationale
   1.3 Programmatic Risks & Mitigations
2. System Design
   2.1 Aircraft
   2.2 Autopilot
   2.3 Obstacle Avoidance
   2.4 Imaging System
   2.5 Target Detection, Classification, Localization
   2.6 Communications
   2.7 Air Delivery
   2.8 Cyber Security
3. Test & Evaluation Plan
   3.1 Developmental Testing
   3.2 Individual Component Testing
   3.3 Mission Testing Plan
4. Safety, Risks, & Mitigations
   4.1 Developmental Risks & Mitigations
   4.2 Mission Risks & Mitigations
   4.3 Operational Risks & Mitigations
5. Conclusion
References


1. Systems Engineering

1.1 Mission Requirement Analysis
After not competing in the AUVSI SUAS competition for 10 years, the 2017 BYU UAV Team was tasked with creating a new platform from the ground up. To complete the competition tasks, sub-teams were formed corresponding to the four main systems that needed to be developed: Airframe, Autopilot, Imaging, and Interoperability. With limited development time as the biggest obstacle, the members of each sub-team defined critical and secondary tasks related to their system. Throughout development, special emphasis was placed on these critical tasks in order to maximize scoring potential during the Mission Demonstration. The tasks and their corresponding level of priority, derived from this analysis, were as follows:

Critical Tasks: Autonomous Flight; Stationary Obstacle Avoidance; Manual Target Detection, Classification, and Localization; Air Delivery

Secondary Tasks: Moving Obstacle Avoidance; Autonomous Target Detection, Classification, and Localization

1.2 Design Rationale

Table 1 – Rationale for hardware selection

Imaging
  Options considered: Sony EV7500, Hitachi DI-SC120R
  Chosen: Sony EV7500
  Rationale: Small and lightweight; fast shutter to minimize motion blur; 30x adjustable zoom

On-Board Computer
  Options considered: Gigabyte Brix i5, Odroid XU4, Raspberry Pi 2 B
  Chosen: Gigabyte Brix i5
  Rationale: Operates well with Ubuntu 16.04, the OS on which ROSplane and peripheral applications are built; i5 processor allows both flight estimation and image processing to happen onboard; possesses all necessary USB and Ethernet ports

Data Link
  Options considered: Ubiquiti PicoStation, 3DR Radio
  Chosen: Ubiquiti PicoStation
  Rationale: Long-range communication capabilities at high bandwidth; no need to actively point the antenna to maintain connection with the UAS during flight

Autopilot
  Options considered: ROSplane/ROSflight, Pixhawk APM, Pixhawk PX4
  Chosen: ROSplane/ROSflight
  Rationale: Familiarity with autopilot architecture; high-fidelity simulations with Gazebo; easily modified and manipulated

Airframe
  Options considered: MyTwinDream, Anaconda, custom airframe
  Chosen: MyTwinDream
  Rationale: Belly-landing capability; large cargo volume and weight capacity; long, efficient flight


1.2.1 Airframe
The first major design decision for the airframe team was whether to use a fixed-wing or multi-rotor aircraft. While multi-rotor aircraft provide a significant advantage in many competition tasks, including autonomous landing, waypoint capture, air delivery, and geolocation, previous experience with fixed-wing aircraft and research into other systems led to the decision to develop a fixed-wing UAS. A battery-powered aircraft was selected due to its reliability and ease of use. Because airframe design only minimally factored into the competition points, purchasing an airframe allowed more time for the active development of other systems. Modifications were made to the purchased airframe, as discussed in Section 2.1.

1.2.2 Autopilot
After evaluating various autopilots, no standard, out-of-the-box autopilot was found with sufficient functionality to fulfill the design requirements of the UAS. Instead of trying to dig into the undocumented, poorly structured, and unnecessarily complex software implemented on many of these autopilots, ROSplane, an autopilot created by students at BYU, was implemented and improved to fulfill the design requirements. The open-source nature of ROSplane, along with its corresponding firmware, ROSflight, provides an accessible platform for developing new algorithms and adding functionality to complete competition tasks (Jackson, Ellingson, & McLain, 2016). Structured around the Robot Operating System (ROS), ROSplane provides the architecture for easy communication with, and straightforward troubleshooting of, the autopilot system. A Gigabyte Brix serves as the onboard computer due to its light weight and substantial processing power. This processor was found to be powerful enough to run ROSplane and to process images onboard.

1.2.3 Imaging
The payload capacity of the airframe limited suitable camera choices. As a result, the search was limited to cameras with adequate resolution and image quality that were within the allocated weight budget for the camera. In addition, the chosen cameras needed to provide clear images from an altitude of at least 100 feet while moving. The cameras chosen are described in Section 2.4. The UAS carries two cameras, each providing different functionality. The front-facing camera has a wide field of view for locating targets on the first pass through the search area. The second camera facilitates image classification with its high resolution and optical zoom.

1.2.4 Interoperability
Because TCP throughput on a connection improves the longer the connection stays open, an interoperability system was designed that maintains a single persistent connection with the judges' server. This was necessary because of the frequency and size requirements of the data to be submitted. To send data on a single connection, all relevant intranet data is read by a dedicated ROS node and transmitted via HTTP.

1.3 Programmatic Risks & Mitigations
There are many inherent risks in developing a UAS from the ground up. As mentioned above, the largest risk was being unable to complete the UAS given the limited development time. At the commencement of development, each sub-team composed a list of development risks and plans for mitigating them. The resulting list, compiled in Table 2, served as a guide for development practices throughout the year.


Table 2 – Programmatic Risks & Mitigations

Risk: Damage to Aircraft or Components
Likelihood: High | Impact: High
Mitigation: Always have at least one backup of each component, especially a second airframe fully modified to match the competition airframe

Risk: Autopilot Bugs
Likelihood: Medium | Impact: Low
Mitigation: Many hours testing the autopilot on cheaper, noncritical airframes before implementing it on the competition airframe

Risk: Computer Crash or Loss of Code
Likelihood: Medium | Impact: High
Mitigation: All code hosted in online repositories for access from any computer

Risk: Hardware Integration Issues
Likelihood: High | Impact: Medium
Mitigation: Define all hardware interfaces between systems at the beginning of system development

Risk: Software Integration Issues
Likelihood: High | Impact: Medium
Mitigation: Define all software interfaces between systems at the beginning of system development

2. System Design

2.1 Aircraft

2.1.1 Aerodynamics
The aircraft is a modification of the purchased MyTwinDream airframe. It has a 2.31 m (91 in.) wingspan and is 1.22 m (48 in.) long. Carbon fiber rods along the wings provide flexural support to minimize wing deflection. Small carbon fiber strips provide additional rigidity for the elevator, rudder, and ailerons, ensuring a more evenly distributed load when acted on by the servo. Because of the weight of the internal components, EPP foam wing extensions were added to generate greater lift and increase the L/D ratio. These extensions also help compensate for the loss of lift resulting from the water bottle's placement under the wing. Flight testing showed that the camera opening in the fuselage has minimal aerodynamic effect. A foam landing rib was also added along the bottom of the airframe to prevent the ground from interfering with the propellers during belly landings.

Table 3 – MyTwinDream Airframe Specifications

Main Wing: span 2.31 m (91 in.); area 0.645 m² (1000 in.²); aspect ratio 8.27
Vertical Stabilizer: span 0.609 m (24 in.); area 0.108 m² (167.4 in.²); aspect ratio 3.43
Horizontal Stabilizer: span 0.254 m (10 in.); area 0.016 m² (24.8 in.²); aspect ratio 4

2.1.2 Propulsion
The UAS is powered by two Scorpion SII-3026-1190KV brushless DC motors (rated for 80 A) that drive 27.9 cm x 11.9 cm (11 in. x 4.7 in.) APC propellers. The speed controllers are rated for 100 A, providing sufficient margin in case of current spikes above the expected output. Power for propulsion, as well as for all other onboard components, is provided by a 4S 16,000 mAh 14.8 V lithium polymer battery. Using a single battery decreases weight and keeps the center of gravity in front of the center of pressure, increasing flight stability.


2.1.3 Airframe and Interior
The airframe and components are constructed and arranged to allow for simple access and adjustment. The battery occupies the nose of the UAS to balance the weight of the other components. The camera and gimbal sit on the center of gravity for maximum image stability while the UAS is maneuvering, dodging obstacles, etc. The flight controller also sits just behind the center of gravity on a vibration-damping platform to allow for accurate IMU data. A central PCB provides three power rails as well as easy, locking connectors for servos and other peripherals. One power rail is dedicated to the Flip32 flight controller to supply clean, reliable power. Antennas are embedded into the foam of the airframe to allow for easy removal and to prevent obstruction of other components. The other onboard components (gimbal control board, Arduino, power booster, power splitter, and RC receiver) are fastened to modular wood slats that are easily secured or removed for flight (see Figure 1).

2.1.4 Gimbal
The UAS features a single-axis gimbal (controlling the camera's elevation angle) that points the camera out the side of the UAS. This orientation allows the camera to capture images of ground targets while the aircraft is performing a loiter maneuver (see Sections 2.2.2 and 2.4). A single inertial measurement unit, mounted directly above the camera, provides accelerometer and gyroscope data to the BaseCam SimpleBGC gimbal control board for stabilization. The gimbal angle is controlled using a custom ground station interface. Gimbal commands and encoder feedback are communicated between the gimbal and the onboard computer through an Arduino microcontroller.

2.2 Autopilot

2.2.1 ROSflight
The flight control unit (FCU) onboard the UAS is a Flip32 board running ROSflight firmware. ROSflight was developed at BYU to provide a cheap, open-source FCU for research and development in ROS. The ROSplane autopilot running on the onboard computer sends throttle, aileron, elevator, and rudder deflection commands to the FCU for actuation. While actuating these commands, the FCU relays IMU, differential pressure, and barometer data at rates up to 1 kHz to the onboard computer for state estimation (Jackson, Ellingson, & McLain, 2016). Onboard communications between the FCU and the computer are detailed in Figure 2. The RC commands from the safety pilot feed directly from the RC receiver to the FCU. The ROSflight firmware defaults to RC control of the UAS and only allows control from the onboard computer if an override switch is activated. The state of the RC connection is constantly published from the FCU to the computer. When RC communication is lost or corrupted, failsafe measures are implemented in the ROSplane autopilot.

Figure 1 – Onboard Components


Figure 2 – ROSflight High-Level Architecture

2.2.2 ROSplane
The autonomous flight and navigation capabilities of the UAS are realized through a new autopilot called ROSplane. ROSplane is a fully featured fixed-wing autopilot that was developed at BYU and is a direct implementation of the UAS control architecture presented in Small Unmanned Aircraft: Theory and Practice (Beard & McLain, 2012). A high-level depiction of this control architecture is shown in Figure 3. Several members of the team have completed the BYU UAS course that studies and implements this text, so ROSplane was a natural choice for the autopilot. In addition to team member familiarity, the team has complete control over all code implemented within ROSplane and the ROSflight firmware; there are no black boxes or unnecessary features that obscure or bloat the software. Features such as waypoint navigation and path planning are built in, while features such as obstacle avoidance and gimbal pointing were easily added. A final factor in the team's selection of ROSplane is the ability to simulate all aspects of the autopilot within the Gazebo simulation environment. Running high-fidelity simulations of the UAS in the lab, with the exact control hardware and software used in flight, accelerates development and minimizes surprises during flight tests. To complete competition tasks, several functionalities have been added to the ROSplane autopilot. These modifications are described below.

- Failsafe: To adhere to competition rules, autonomous return to home (RTH) and autonomous flight termination have been added. After 30 seconds of communication loss, the UAS initiates RTH, heading toward a waypoint placed 200 feet directly above its initial takeoff location. After 3 minutes of communication loss, the UAS autonomously terminates flight by commanding the throttle closed, full up elevator, full right rudder, and full right aileron. (A minimal sketch of this timing logic follows this list.)

Figure 3 – ROSplane Control Architecture


- Autonomous Landing: A new state was added to the autopilot's state machine to fulfill the autonomous landing task. In this state, the UAS flies to the end of the runway and spirals down until a certain altitude is reached. The UAS then maintains its course along the runway while slowly decreasing altitude until touchdown.

- Loiter Points: ROSplane's waypoint system generally only allows for waypoints indicating a path. In addition to this path, optional loiter points were added. At each loiter point, the UAS will make a complete circle at a constant altitude before moving on to the next waypoint.

- Waypoint Interrupts: ROSplane, as it was, could only fly a list of waypoints from start to end, with no changes to the waypoint path during flight. The autopilot was modified to allow the waypoint path to be updated mid-flight to complete the moving obstacle avoidance task.
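To make the failsafe timing concrete, the following is a minimal sketch of the logic described in the Failsafe item above, written as a standalone Python/ROS node. The topic names, message types, and the string-valued command interface are illustrative assumptions, not the actual ROSplane interfaces.

```python
# Minimal sketch of the RC/communication-loss failsafe timing described above.
# Topic names and message types are illustrative assumptions, not the actual
# ROSplane/ROSflight interfaces.
import rospy
from std_msgs.msg import Bool, String

RTH_TIMEOUT_S = 30.0          # return to home after 30 s of communication loss
TERMINATE_TIMEOUT_S = 180.0   # terminate flight after 3 minutes of loss

class FailsafeMonitor(object):
    def __init__(self):
        self.last_link_ok = rospy.Time.now()
        self.mode_pub = rospy.Publisher('failsafe_command', String, queue_size=1)
        rospy.Subscriber('rc_link_ok', Bool, self.link_cb)
        rospy.Timer(rospy.Duration(0.5), self.check)

    def link_cb(self, msg):
        if msg.data:                      # FCU reports a healthy link
            self.last_link_ok = rospy.Time.now()

    def check(self, _event):
        outage = (rospy.Time.now() - self.last_link_ok).to_sec()
        if outage > TERMINATE_TIMEOUT_S:
            # throttle closed, full up elevator, full right rudder/aileron
            self.mode_pub.publish(String('TERMINATE'))
        elif outage > RTH_TIMEOUT_S:
            # head to a waypoint 200 ft above the takeoff location
            self.mode_pub.publish(String('RETURN_TO_HOME'))

if __name__ == '__main__':
    rospy.init_node('failsafe_monitor')
    FailsafeMonitor()
    rospy.spin()
```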

2.2.3 Ground Control Station
For real-time communication with the UAS during flight, a custom ground control station (GCS) was built from the ground up. The GCS was created as a plugin for a ROS GUI interface, for reasons similar to those that motivated a BYU-built autopilot. The GCS facilitates simple communication between the UAS and the ground station; no additional data manipulation is required beyond the messages that ROSplane already outputs and processes internally. The GCS subscribes to ROSplane's telemetry data, which describe its current state and waypoint path, as well as to real-time obstacle and mission boundary information published by the ground interoperability client. UAS telemetry data is received at a rate of 100 Hz. The GCS then performs the following functions:

- Display UAS pose, path, and boundary data: This is accomplished through modules 1, 2, and 3 as shown in Figure 4. Module 1, the map widget, superimposes the UAS’s GPS position and heading on a map of the user’s choosing. In a similar fashion, waypoints and competition boundaries are drawn on top of the map layer, updating at 10 Hz. Module 2 displays an artificial horizon, modeled after existing ground stations such as Mission Planner. From this module, data about the UAS’s pitch, roll, yaw, airspeed, and altitude can be viewed at all times. The user can toggle between different parameters that are dynamically graphed in module 3.

- Handle current path and waypoints: Waypoints are handled through module 4, which consists of a pop-up window that allows the user to load waypoint files, insert new waypoints, or delete specific waypoints. This interface accommodates both pre-determined flight paths and real-time flight adjustments for sweeping searches and emergent tasks.

- Implement sensor calibration, loiter, and return to home commands: Such commands are handled in module 5, which consists of another pop-up window, allowing the user to specify commands to alter the behavior of the UAS mid-flight. All commands to the UAS are sent over the ROS network.
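As an illustration of the data flow behind module 1, the sketch below caches the 100 Hz telemetry stream and redraws the map layer on a 10 Hz timer. The topic name and the imported State message are stand-ins for the actual ROSplane message definitions, not a verified interface.

```python
# Sketch of the GCS map-widget data flow: telemetry arrives at ~100 Hz, but
# the map layer only redraws at 10 Hz. The topic name and State message are
# assumed placeholders for the actual ROSplane definitions.
import rospy
from rosplane_msgs.msg import State   # assumed message package/type

class MapWidgetBackend(object):
    def __init__(self):
        self.latest_state = None
        rospy.Subscriber('/fixedwing/state', State, self.state_cb)  # ~100 Hz
        rospy.Timer(rospy.Duration(0.1), self.redraw)               # 10 Hz

    def state_cb(self, msg):
        self.latest_state = msg            # cache only; no drawing in the callback

    def redraw(self, _event):
        if self.latest_state is None:
            return
        # hand position and heading to the map layer (drawing code omitted)
        rospy.logdebug('redraw at n=%.1f e=%.1f',
                       self.latest_state.position[0],
                       self.latest_state.position[1])
```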


Figure 4 – Ground Control Station Interface

2.3 Obstacle Avoidance

2.3.1 Static Obstacles
During autonomous flight, the UAS flies between waypoints along time-minimal Dubins paths. If these paths are planned unconstrained, violations of competition boundaries or collisions with obstacles are likely to occur. To avoid such collisions, path planning is done at the flight line using a rapidly-exploring random tree (RRT) algorithm. A high-level overview of the algorithm is given by the steps below.

1. All obstacles and boundaries are inflated to account for wing length and potential disturbances such as gusts of wind.

2. Dubins paths are planned between each waypoint in the waypoint sequence.

3. Each segment along the Dubins path is checked for collision.

4. When a collision is detected between two waypoints, RRT explores possible solution paths until a path is successfully found connecting the two waypoints.

5. The RRT solution path is smoothed to produce a minimum number of additional waypoints.

6. Additional waypoints are inserted into the waypoint sequence to avoid the stationary obstacle.

An example RRT solution between waypoints can be seen in Figure 5 (Beard & McLain, 2012).
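The planning steps above can be illustrated with the simplified 2D sketch below, which uses straight-line segments in place of Dubins paths to keep the example short. The obstacle inflation, collision checking, RRT search, and smoothing pass mirror steps 1 through 6, but the geometry and tuning values are illustrative assumptions rather than the flight code.

```python
# Simplified 2D sketch of the planning steps above: inflate obstacles, check
# the segment between two waypoints, and if it collides run an RRT to find a
# detour, then smooth it. Straight segments stand in for Dubins paths.
import math, random

def inflate(obstacles, margin):
    # obstacles: list of (x, y, radius); margin covers wingspan and wind gusts
    return [(x, y, r + margin) for x, y, r in obstacles]

def segment_clear(p, q, obstacles, step=1.0):
    length = math.hypot(q[0] - p[0], q[1] - p[1])
    n = max(int(length / step), 1)
    for i in range(n + 1):
        t = i / float(n)
        x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
        if any(math.hypot(x - ox, y - oy) <= orad for ox, oy, orad in obstacles):
            return False
    return True

def rrt(start, goal, obstacles, bounds, iters=2000, step=20.0):
    """bounds = ((xmin, xmax), (ymin, ymax)); returns a smoothed detour or None."""
    nodes = {start: None}                       # child -> parent
    for _ in range(iters):
        sample = goal if random.random() < 0.1 else (
            random.uniform(*bounds[0]), random.uniform(*bounds[1]))
        near = min(nodes, key=lambda n: math.hypot(n[0]-sample[0], n[1]-sample[1]))
        d = math.hypot(sample[0]-near[0], sample[1]-near[1])
        if d == 0:
            continue
        if d <= step:
            new = sample
        else:
            new = (near[0] + step*(sample[0]-near[0])/d,
                   near[1] + step*(sample[1]-near[1])/d)
        if not segment_clear(near, new, obstacles):
            continue
        nodes[new] = near
        if new != goal and segment_clear(new, goal, obstacles):
            nodes[goal] = new
            new = goal
        if new == goal:
            path, n = [], goal
            while n is not None:
                path.append(n)
                n = nodes[n]
            return smooth(path[::-1], obstacles)
    return None                                  # no collision-free path found

def smooth(path, obstacles):
    # greedily skip intermediate nodes whenever a direct segment is clear
    out, i = [path[0]], 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not segment_clear(path[i], path[j], obstacles):
            j -= 1
        out.append(path[j])
        i = j
    return out
```

In the flight software the same idea operates on Dubins segments and the inflated competition obstacles, and only the smoothed detour waypoints are inserted into the mission (steps 5 and 6 above).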

Figure 5 – Example RRT Solution


2.3.2 Moving Obstacles
Dynamic obstacles present a more difficult collision avoidance challenge. Accurate long-term predictions of the locations of such objects are not realistic. Instead of planning for these obstacles a priori, the UAS continuously estimates the locations and velocities of dynamic obstacles relative to its own location. When the UAS comes within a certain distance threshold of a moving obstacle, the ground station runs an RRT avoidance algorithm. This algorithm produces additional waypoints to avoid all static and moving obstacles.

2.3.3 Algorithm Robustness
RRT is random by nature, and the resulting path is not guaranteed to be optimal. To counter difficult-to-foresee situations that could lead to violations of competition rules, the path planning and managing software are designed with margins of safety in mind. During initial planning, boundaries are moved inward by a fixed distance in the software's internal representation. This helps keep the UAS from moving out of bounds due to disturbances such as wind gusts. These altered boundaries are also used during path planning around dynamic obstacles.

2.4 Imaging System
The circumstances of the competition pose a set of unique challenges when it comes to imaging. The first challenge is gathering high-quality imagery of ground targets from a moving UAS at altitude. Obstacles to overcome for this challenge include motion blur, timing, and having sufficient resolution to resolve target features. A second major challenge is data bandwidth between the UAS and the ground station. If the image data rate is too high, or if image resolution and file sizes are too large, the reliability of image transfer can be significantly reduced and images may be significantly delayed or even lost. The last major challenge is the camera's physical size. Since the imaging sensor must be carried by a UAS with very limited payload capacity, it must be compact and relatively lightweight.

Figure 6 – Image System Diagram

To overcome the aforementioned challenges, a set of requirements for the primary target characterization camera was outlined as follows:

• Modest sensor resolution, yet sufficient to resolve target features
• Internal image stabilization to minimize motion blur
• High-quality optics
• Optical zoom
• Fully configurable camera parameters
• Computer control interface
• Compact size and low weight

Figure 7 – Sony FCB-EV7500 Industrial Block Camera


With this set of requirements in mind, the Sony FCB-EV7500 industrial block camera was selected. The EV7500 features a high-quality 2.38-megapixel CMOS sensor, a 30x stabilized optical zoom, and support for the Sony VISCA serial control interface, all in a package measuring 50 x 60 x 90 mm. By using the optical zoom on the EV7500, target features can be resolved from significant distances even though the image sensor resolution is quite modest (see Figure 8). The EV7500 also has an outstanding track record in UAS imaging; it is the primary image sensor in several state-of-the-art professional gimbal systems for UAS and full-scale aircraft.

A unique aspect of the UAS imaging system is the incorporation of two separate image sensors. The first and primary sensor is the Sony EV7500 described above, mounted in a single-axis brushless gimbal oriented to point out the right side of the UAS. The second sensor is a very small first-person-view (FPV) camera fixed to the nose of the UAS in a forward-and-down orientation. The purpose of the nose camera, or spotter camera, is to spot ground targets and provide an initial estimate of their locations. The purpose of the EV7500 is to gather high-quality images and to refine the geolocation of targets as the UAS flies a loiter path around the estimated target location. This process of detection, initial localization, refined localization, and target characterization is detailed in the following section.

2.5 Target Detection, Classification, Localization

2.5.1 Manual Detection, Classification and Localization
Target detection is based on operator input. As the UAS flies through the target search area, an operator on the ground watches the video feed from the nose-fixed spotter camera. When a target appears in the video feed, the operator simply clicks on the target's location in the image frame. Each click returns a rectified pixel coordinate. This pixel coordinate, along with pose data from the autopilot, is then used to generate an estimate of the target's location. The operator can increment or decrement the target counter at any time as new targets move into view. It is advantageous to click multiple times on a target because the localization algorithm averages all of the calculated locations for each target, minimizing operator error. When the UAS has finished flying its search path, a list of estimated target locations is available for the UAS to revisit in a loiter flight mode (see Figure 9).

Figure 9 – Search area scan using the spotter cam (left) and individual target scan with the EV7500 (right)

Figure 8 – View of Y Mountain through the EV7500 with no zoom (left) and full optical zoom (right)


Localization is performed by assuming a flat Earth model (Beard & McLain, 2012). A target’s location in the inertial frame can be given by the following equation:

\[
\mathbf{p}_{\mathrm{obj}}^{i} \;=\; \begin{bmatrix} p_n \\ p_e \\ p_d \end{bmatrix} \;+\; h\,\frac{R_b^i R_g^b\,\boldsymbol{\ell}^c}{\mathbf{k}^i \cdot R_b^i R_g^b\,\boldsymbol{\ell}^c} \qquad \text{(Eq. 1)}
\]

Definitions for each term in Equation 1 can be found in Small Unmanned Aircraft: Theory and Practice (Beard & McLain, 2012).
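For illustration, Eq. 1 can be read directly as a few lines of NumPy. In this sketch, the rotation matrices, the camera-frame unit vector derived from the clicked pixel, and the UAS position in NED coordinates are all assumed to be supplied by the autopilot and camera model.

```python
# Numerical sketch of Eq. 1 (flat-earth geolocation, Beard & McLain 2012).
# R_b_i is the body-to-inertial rotation, R_g_b the gimbal/camera-to-body
# rotation, and ell_c the unit vector from the camera toward the clicked
# pixel, expressed in the camera frame; all are assumed to be given.
import numpy as np

def locate_target(p_uav_ned, R_b_i, R_g_b, ell_c):
    """Return the estimated target position in the inertial (NED) frame."""
    ell_i = R_b_i @ (R_g_b @ ell_c)         # line-of-sight vector in the inertial frame
    h = -p_uav_ned[2]                        # height above ground (p_d is positive down)
    k_i = np.array([0.0, 0.0, 1.0])          # inertial down axis
    denom = k_i @ ell_i
    if denom <= 0:
        raise ValueError("line of sight does not intersect the ground")
    return p_uav_ned + h * ell_i / denom
```

Averaging the positions returned for multiple operator clicks, as described above, reduces the effect of pose and pixel noise.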

An image classification GUI serves as the main tool for submitting cropped images and target characteristics. The GUI displays all the images associated with a specific target and allows the user to select the best one. Every image displayed in the GUI has corresponding geolocation and heading data. Upon selection of an image, an average of all geolocation values (omitting outliers) is calculated. The user then uses the GUI to rotate, crop, and classify the target. The rotation of the image in the GUI is combined with the stamped heading data to calculate the final target heading. Cropping the image produces a target that is clearly visible and fills more than 25% of the submitted image. To classify the target, there are input fields for each characteristic (alphanumeric, alphanumeric color, target shape, etc.) that are packaged along with the geolocation data at submission time. Once all of these values have been calculated and entered, they are submitted via the interoperability system with the click of a button.
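The submission step itself can be sketched with python-requests, which reuses a single TCP connection through requests.Session(), in line with the single-connection design of Section 1.2.4. The server address, endpoint paths, and field names below are assumptions for illustration, not a verified description of the judges' API.

```python
# Hedged sketch of submitting a classified target over one persistent HTTP
# connection. requests.Session() keeps a single TCP connection alive; the
# endpoint paths and payload fields are assumptions, not the official API.
import requests

session = requests.Session()
BASE = 'http://interop-server:8000'   # assumed server address

def login(username, password):
    r = session.post(BASE + '/api/login',
                     json={'username': username, 'password': password})
    r.raise_for_status()

def submit_target(characteristics, cropped_jpeg_bytes):
    # 1) post the target characteristics, 2) attach the cropped image
    r = session.post(BASE + '/api/targets', json=characteristics)
    r.raise_for_status()
    target_id = r.json()['id']
    r = session.post('%s/api/targets/%d/image' % (BASE, target_id),
                     data=cropped_jpeg_bytes,
                     headers={'Content-Type': 'image/jpeg'})
    r.raise_for_status()
```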

Figure 10 – Image editing and classification GUI


2.5.2 Autonomous Detection, Classification and Localization
Autonomous image processing implements two convolutional neural networks. These deep-learning models are built using the TensorFlow open-source machine learning library. The system comprises two subsystems, one for shape classification and one for character classification (Figure 11). Both subsystems use the same training structure, separating the convolutional network into a main training graph and a spatial transformer graph. The two subsystems are trained according to their function: classifying shapes or characters.

Figure 11 – Autonomous Classification

Main Training Graph
The training graph contains over 20 sub-layers, each representing a distinct convolutional layer mixed with averaging (pooling) layers. The graph ends with a fully connected layer and a normalized output giving the probability the network assigns to each candidate label. The input is a standard image of any size with no rotation or skewed distortions.

Spatial Transformer Graph
The spatial transformer graph accounts for the rotated and uncropped images that will be received from the EV7500 camera. This layer uses the basic graph shown in Figure 12, where θ is a set of operations to be applied to an image. As the graph learns θ, it is able to apply these transformations to the original image. The last step of the network applies θ to the original image and feeds a cropped and correctly scaled image through to the main training graph (Jaderberg, Simonyan, Zisserman, & Kavukcuoglu, 2015).
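As a heavily reduced sketch of the main training graph, the fragment below stacks a few convolution-plus-average-pooling blocks and ends in a fully connected softmax layer, using the TensorFlow 1.x graph API of the time. The layer count, filter sizes, 64x64 input, and 36-class alphanumeric output are illustrative assumptions, and the spatial transformer graph is omitted.

```python
# Heavily reduced sketch of the "main training graph": a few convolution +
# average-pooling blocks feeding a fully connected softmax classifier.
# The real network uses 20+ sub-layers; the sizes here are illustrative.
import tensorflow as tf

NUM_CLASSES = 36   # assumed: letters A-Z plus digits 0-9

def conv_avgpool(x, out_channels, name):
    in_channels = x.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        w = tf.get_variable('w', [5, 5, in_channels, out_channels],
                            initializer=tf.truncated_normal_initializer(stddev=0.1))
        b = tf.get_variable('b', [out_channels],
                            initializer=tf.zeros_initializer())
        conv = tf.nn.relu(tf.nn.conv2d(x, w, [1, 1, 1, 1], 'SAME') + b)
        return tf.nn.avg_pool(conv, [1, 2, 2, 1], [1, 2, 2, 1], 'SAME')

images = tf.placeholder(tf.float32, [None, 64, 64, 3])
labels = tf.placeholder(tf.float32, [None, NUM_CLASSES])

net = conv_avgpool(images, 16, 'block1')
net = conv_avgpool(net, 32, 'block2')
net = conv_avgpool(net, 64, 'block3')           # 64x64 input -> 8x8 feature maps

flat = tf.reshape(net, [-1, 8 * 8 * 64])
logits = tf.layers.dense(flat, NUM_CLASSES)      # fully connected layer
probs = tf.nn.softmax(logits)                    # normalized class probabilities

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)
```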

Figure 12 – Spatial Transformer Network


2.6 Communications
The UAS contains three independent communication systems, described below.

1.3 GHz: The 1.3 GHz link between the UAS and the ground station is reserved for analog video from the spotter camera on the nose of the UAS.

2.4 GHz FHSS: To ensure reliable RC control, the safety pilot communicates with the UAS on a separate 2.4 GHz frequency-hopping spread spectrum link. This communication occurs between a Taranis Q X7 transmitter and a D4R-II receiver onboard the UAS. Only the raw RC commands are sent over this link during flight.

2.4 GHz: The main data link between the UAS and the ground control station operates over a 2.4 GHz connection. A Ubiquiti airMAX® Sector antenna on the ground and a Ubiquiti PicoStation™ onboard the UAS provide the bridge for all data transfer over the ROS network. The UAS publishes telemetry data and images while subscribing to waypoint commands from the ground station. Communication between systems on the ground also happens over the ROS network through an Ethernet switch (see Figure 13).

2.7 Air Delivery
Air delivery capability is enabled through a delivery capsule and a payload drop algorithm that takes into account UAS airspeed, altitude, heading, and wind. Based on water bottle drop tests, a single-use capsule has been designed to protect the bottle on impact. This capsule, attached to a parachute for guidance, is constructed from high-density cardboard tubing and ASTM D5511 certified biodegradable foam. Release of the capsule is accomplished with a PWM-driven solenoid embedded into the underside of the left wing. A MATLAB script was written to simulate the dynamics of the free-falling capsule and determine the optimal drop location for the UAS.

2.8 Cyber Security
Using a STRIDE threat model, important assets and threats were identified, allowing design decisions to be made with security in mind. For instance, the wireless radio is one critical asset that an attacker could target to sniff data transmitted from the UAS. To counter that possibility, a Ubiquiti Rocket controller that supports AES encryption is used. In addition, a multitude of threats are countered with a simple firewall that supports Network Address Translation (NAT). Without NAT, all systems would be exposed and potential attackers would have a wide attack surface. Through these methods, the system is protected against a variety of possible threats.
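As a simplified stand-in for the MATLAB drop simulation mentioned in Section 2.7, the sketch below integrates the capsule's fall with quadratic drag and a constant along-track wind, then backs the release point off from the target by the predicted drift. The drag constant and the example numbers are illustrative assumptions, not measured capsule parameters.

```python
# Simplified release-point calculation for Section 2.7: integrate the
# capsule's fall with quadratic drag and a constant along-track wind, then
# release early enough to cancel the predicted drift. Values are illustrative.
import numpy as np

G = 9.81  # m/s^2

def drift_during_fall(altitude, airspeed, wind, drag_per_mass=0.05, dt=0.01):
    """Integrate capsule motion until ground impact; return horizontal drift (m)."""
    pos = np.array([0.0, altitude])             # [downrange, height]
    vel = np.array([airspeed + wind, 0.0])       # released at aircraft ground speed
    while pos[1] > 0.0:
        air_rel = vel - np.array([wind, 0.0])    # velocity relative to the air mass
        drag = -drag_per_mass * np.linalg.norm(air_rel) * air_rel
        acc = drag + np.array([0.0, -G])
        vel += acc * dt
        pos += vel * dt
    return pos[0]

def release_point(target_x, altitude, airspeed, wind):
    """Ground-track coordinate at which the capsule should be released."""
    return target_x - drift_during_fall(altitude, airspeed, wind)

# Example: releasing from 45 m altitude at 15 m/s airspeed with a 3 m/s tailwind,
# release_point(0.0, 45.0, 15.0, 3.0) gives the required offset before the target.
```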

Figure 13 – Ground Station Communication


3. Test & Evaluation Plan

3.1 Developmental Testing

3.1.1 Motor/Propeller Testing
To help determine the optimal propulsion system, many tests were performed using different propellers and motors. The three motors tested were the Scorpion SII-4025-520KV, Turnigy D3548, and T-Motor MN3110-15. Each motor was tested with various propellers to find which motor/propeller combination would give the most thrust for the least amount of power. Parameters of interest included propeller length, pitch, and material. The tests were performed on a thrust stand by slowly increasing the motor current from 0 A to 50 A; at each current step, the resulting thrust and electrical power were measured. The test results are shown in Figure 14. The tests showed that the 11 in. x 4.7 in. APC propeller with the Scorpion motor was optimal; this combination produced about 2 kgf of thrust while drawing only 500 W of power.

3.1.2 Air Delivery
To protect the bottle upon impact, different materials and enclosures were tested. Test materials for softening the impact included high- and low-density foams, shock-absorbing rubbers, and mats of drinking straws. The effect of drag behind the enclosure was also tested; each design included a drag chute or streamer to maintain vertical orientation during freefall. Out of 11 tests, four succeeded. In each successful test, the bottle landed bottom-down, showing that proper orientation was more important than container material. The final design uses a parachute on the payload to keep it vertically oriented during descent.

3.2 Individual Component Testing

3.2.1 Autonomous Flight
Autonomous flight was tested in three phases: simulation, test flights on non-critical airframes, and final tests. This multi-phase approach was used to increase safety, mitigate risks, and decrease development time.

Simulation
The autopilot was first tested in the Gazebo simulator, which uses high-fidelity, nonlinear equations of motion to simulate the response of the UAS. The simulator enabled a short and efficient iteration cycle to rapidly identify and correct significant implementation errors in ROSplane. Test flights in the simulator were also significantly easier, cheaper, and less risky to debug. For waypoint testing, the simulation was started with a predetermined set of points for the UAS to fly. As the UAS flew, the simulator plotted its path against the provided waypoints, as shown in Figure 15. Waypoint accuracy improved when additional points were added along the planned path, which helped put the UAS in the right position to achieve maximum accuracy at the graded waypoints.

Figure 14 – Scorpion and T-Motor propeller tests: thrust (kgf) versus electrical power (W) for each motor/propeller combination


For state estimation testing, the simulator provided access to both the true, simulated states and ROSplane's estimated states. This made it possible to test the accuracy and limitations of the state estimators. Many different flight paths (climbing, turning, loitering, etc.) were flown in simulation, and graphs of true versus estimated states were created. To test obstacle avoidance capabilities, a 2D map was created with a small number of stationary obstacles for a given altitude, and the number of stationary obstacles was increased for each simulation. The simulations showed that the RRT algorithm will find a solution path for any number of static obstacles, provided a solution path exists and enough time is allotted. Testing concentrated on cylindrical and spherical obstacles, both of which will be encountered during the competition. The algorithm was also tested with differently sized and shaped boundaries, roughly based on the competition areas from previous years.

Flight Tests
Flight tests were first carried out on the lighter, less expensive Bixler aircraft (Figure 16), and subsequent tests have been done on the MyTwinDream aircraft. Waypoint navigation testing mirrored the simulator tests: the UAS was commanded to fly a path defined by waypoints, and during flight the ground control station showed the measured path of the UAS versus the commanded waypoints. In these tests, the error for each waypoint was measured. To test the UAS's obstacle avoidance algorithm, custom missions were set up using the judges' server. A few stationary obstacles were created and positioned near the waypoints. To verify that the UAS avoided these obstacles, the ground control station plotted both the UAS's path and the obstacles on a map. This was also visualized on the judges' server to ensure that the obstacles, paths, and waypoints were in agreement. Once the UAS successfully avoided a few stationary obstacles, moving obstacles were added, and differently shaped competition boundaries were also used. When the UAS was able to avoid all the obstacles, more difficult tests were created (such as waypoints between stationary obstacles and the competition boundary).
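The waypoint error measurement described above can be expressed as a small helper that, for each commanded waypoint, reports the closest approach of the flown track. Positions are assumed to have already been converted to local north-east coordinates in metres.

```python
# Sketch of the waypoint-error measurement used in the flight tests: for each
# commanded waypoint, find the closest point on the flown track. Coordinates
# are assumed to be local north-east positions in metres.
import numpy as np

def waypoint_errors(track_ne, waypoints_ne):
    """track_ne: (N, 2) flown positions; waypoints_ne: (M, 2) commanded waypoints."""
    track = np.asarray(track_ne, dtype=float)
    errors = []
    for wp in np.asarray(waypoints_ne, dtype=float):
        errors.append(np.min(np.linalg.norm(track - wp, axis=1)))
    return errors   # one miss distance (m) per waypoint
```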

Figure 15 – Autopilot Simulation

Figure 16 – Bixler Aircraft


3.2.2 Imaging
Testing and experience have shown that the image sensors on the UAS can be reliably accessed by both the onboard computer and the GCS as long as procedures are followed. To minimize the opportunity for error, scripts have been written to reduce the chance of human input error when launching or accessing the image sensors. Several test runs have shown that images can be reliably transferred from the onboard EV7500 camera through the target classification and localization system and then uploaded via the interoperability server. The greatest opportunity for failure in this process has been improper setup of the ROS communication network; to ensure that improper setup does not occur, scripts have been written and tested to perform this setup automatically.

3.2.3 Object Detection, Classification, and Localization
To ensure the reliability of the target detection, localization, and classification process, all elements of the UAS imaging system will undergo rigorous testing prior to the competition. A major factor in the success of the imaging system is the imaging operator's ability to clearly see and detect targets in the video feed from the spotter camera. Given that the spotter camera's analog video feed has limited quality and resolution, tests will be performed to determine the optimal height above ground from which the search area should be initially scanned. The altitude must be high enough to ensure full coverage of the search area while low enough for targets to be easily spotted. To determine this optimal search altitude, flight tests in a variety of terrain and lighting conditions will be conducted. Initial flight tests with the spotter camera have been promising. In a similar fashion, the EV7500 camera also needs further testing to determine the ideal altitude, loiter radius, and zoom setting for acquiring good imagery of ground targets. Ground tests have shown that without zoom, mid-sized targets can be easily seen and distinguished at approximately 60 m (197 ft). Further tests from the air will be conducted to determine how much zoom, if any, is needed. To test the localization algorithm, the UAS will be flown while the imaging interface is used to localize ground targets with known locations. The error between the calculated and actual target location will then be measured to determine whether the algorithm is functioning properly. Target classification may occur either manually via the imaging GUI or autonomously by the machine learning algorithm. To ensure that manual classification is reliable, a user will use the GUI to classify multiple target image sets. All GUIs will be tested to verify that unexpected errors or crashes do not occur. Testing of the autonomous machine learning algorithm will also take place during these flight tests.

3.2.4 Communications
Ground-based range tests were performed to evaluate the reliability of communications between the ground station and the aircraft. The UAS was mounted to the top of a vehicle, and the distance between the UAS and the GCS was gradually increased. In this test, the UAS was sending images from the EV7500 camera at a rate of 1 Hz. The results showed that autopilot telemetry data and camera image data could be reliably communicated at up to 1.13 km (0.7 mi), which is adequate given the size of the competition grounds. Further testing will be performed to ensure that the data communication range from the air matches or exceeds this initial result. Similar range tests will also be conducted to ensure that the communication range of the safety pilot RC link and the 1.3 GHz analog video link are more than adequate for the competition.

3.2.5 Air Delivery
Air delivery tests will be performed first manually and then autonomously. The drop mechanism has already been tested and proven reliable on the ground. Manual air delivery tests will be conducted in flight to confirm reliable release with minimum latency. Following manual testing, the autonomous delivery algorithm will be tested to ensure that the drop mechanism releases at the appropriate time.

3.3 Mission Testing Plan
Starting at the end of April, weekly mission tests will be performed until the competition. Each test will be evaluated using the judges' interoperability server, and additional complexity will be added to the mission test each week. These tests are outlined in the schedule shown in Table 4.


Table 4 – Test Schedule

Week of May 1: Manual Flight, Manual Classification, Manual Air Delivery
Week of May 8: Autonomous Flight, Waypoints, Search Area, Manual Classification, Autonomous Air Delivery
Week of May 15: Autonomous Flight, Waypoints, Search Area, Manual Classification, Autonomous Air Delivery, Stationary Obstacles
Week of May 22: Autonomous Flight, Waypoints, Search Area, Manual Classification, Autonomous Air Delivery, Stationary Obstacles, Moving Obstacles, Autonomous Classification
Week of May 29: Full Mock Competition
Week of June 5: Full Mock Competition

Successful tests will have results similar to those shown in Table 5. Such results will indicate that the project is on track for the competition.

Table 5 – Predicted Results

Autonomous Flight
  Autonomous Takeoff – 100%: Smooth takeoff from hand launch
  Autonomous Landing – 100%: Smooth, level belly landing
  Waypoint Capture – 100%: Fly within 30.48 m (100 ft) of all waypoints

Obstacle Avoidance
  Stationary Obstacle Avoidance – 85%: Avoid obstacles completely except when wind blows the aircraft off course
  Moving Obstacle Avoidance – 75%: Avoid obstacles completely except when wind blows the aircraft off course

Object Detection, Classification, Localization
  Manual Characteristics – 75%: Clearly identify shape, color, and character of ground targets
  Geolocation – 75%: Target location identified within 12.2 m (40 ft) of true target geolocation
  Actionable – 100%: All images submitted while the UAS is airborne
  Autonomy – 50%: Autonomous identification correct for half of the targets
  Interoperability – 100%: All identified images submitted via the interoperability system

Air Delivery
  Delivery Accuracy – 75%: Water bottle delivered within 12.2 m (40 ft) of the target delivery point
  Safe Delivery – 100%: Water bottle unbroken after impact

If mission testing yields undesirable results, the team will evaluate the issue, its magnitude, and the time required to resolve it. Issues that are easier to resolve and/or worth more points will be given higher priority. Improvements that are not implemented due to time constraints will be noted for development by future teams.

4. Safety, Risks, & Mitigations
Throughout all phases of development and testing of the UAS, safety has been a top priority. In all circumstances, safety controls were put in place and strictly adhered to by members of the development team. The following sections outline potential safety hazards and detail the safety controls used to mitigate risk.


4.1 Developmental Risks & Mitigations
The development process posed several safety risks, described in Table 6 below.

Table 6 – Developmental Risks and Mitigations

Dynamometer test stand safety hazard:
- A fiberglass shield installed to isolate the dynamometer from the rest of the room
- Safety glasses worn during testing
- Testing personnel stood at a safe distance during all tests
- Software cutoff failsafe active during all tests

Water bottle drop test safety hazard:
- University Police granted permission for the tests
- Cones and personnel stationed around the drop test impact zone to warn anyone nearing the site
- Tests performed during early morning hours to minimize risk to people

Autopilot/communications malfunction:
- Experienced safety pilot ready to take manual control in case of autopilot malfunction
- UAS kept within safe range of the safety pilot's line of sight
- Audible low-signal warning should the UAS approach the range limit

UAS crash damage or injury:
- Test flights conducted in a large, remote area
- Test personnel stationed near a safety cover structure
- Personnel kept a safe distance during launch, testing, and recovery of the UAS

4.2 Mission Risks & Mitigations
The competition mission and autonomous flight pose a number of safety risks, outlined in Table 7.

Table 7 – Mission Risks and Mitigations

Autopilot malfunction:
- Experienced safety pilot ready to take manual control in case of autopilot malfunction

Loss of communications with UAS:
- 30-second communication loss triggers return to home (RTH)
- 3-minute communication loss triggers flight termination

Unintentional water bottle drop:
- Water bottle mechanism disarmed by default
- Only armable when the safety pilot switch is enabled

Unintentional equipment drop from UAS:
- Internal components securely fastened inside the UAS
- External components (water bottle assembly, antennas, etc.) securely attached and checked prior to flight
- Locking fasteners and thread-lock used where applicable

Electrical fire:
- Safety factor of 1.5 (minimum) for all electrical power requirements
- Batteries balance-charged and inspected for damage
- All charging done in a fireproof box

Loss of power during flight:
- Battery chosen to allow 1.5 times the required flight time
- Battery checked for full voltage before flight

Loss of propeller:
- Propellers inspected before flight to ensure secure attachment and no blade damage

Loss of control of control surfaces:
- Control surfaces inspected before flight
- Servo linkages reinforced and inspected before flight


4.3 Operational Risks & Mitigations Before flight, the UAS is carefully inspected by the safety pilot according to an extensive pre-flight checklist. The checklist for the safety pilot includes, but is not limited to, the following procedures:

1. Inspecting the motors to verify that propellers and motors are securely fastened
2. Carrying out a static thrust test and test of all control surfaces
3. Checking the weight and balance of the UAS
4. Inspecting the battery for damage, checking that it is fully charged
5. Inspecting all control surfaces and servo linkages
6. Inspecting the airframe for damage
7. Inspecting all components to ensure secure attachment

A physical safety key and an electronic switch mitigate the possibility of unintentional arming. The motors remain disarmed (except during the static thrust test) until all checklists are completed, all team personnel are at their designated posts, and the UAS is cleared for takeoff. In addition to these procedures, each member of the team completes their respective safety checklist. Operational safety is further increased by several safety items kept at the flight line: a small fire extinguisher, drinking water, sunscreen, safety glasses, a helmet for the person launching and recovering the UAS, and a first aid kit. These measures provide a safe operational environment for the team and spectators.

5. Conclusion
The BYU UAV Team has designed, built, and tested a UAS ready to excel at the mission tasks in the 2017 AUVSI Student UAS Competition. Throughout the course of the year, a systems engineering approach has been valuable for converging on a successful design and mitigating development risks. The final UAS design features an innovative autopilot, a state-of-the-art imaging system, and a robust communications network. Extensive testing has been performed to support the successful completion of competition tasks, and frequent testing will continue during the preparation period to ensure reliability and satisfactory operator training.

References
Beard, R. W., & McLain, T. W. (2012). Small Unmanned Aircraft: Theory and Practice. Princeton, NJ: Princeton University Press.
Jackson, J., Ellingson, G., & McLain, T. (2016). ROSflight: A lightweight, inexpensive MAV research and development tool. 2016 International Conference on Unmanned Aircraft Systems (ICUAS), 758-762.
Jaderberg, M., Simonyan, K., Zisserman, A., & Kavukcuoglu, K. (2015). Spatial Transformer Networks. Advances in Neural Information Processing Systems, 2017-2025.