


Simplistic Sonar based SLAM for low-cost Unmanned Aerial Quadrocopter systems

Christopher R. Hudson
Computer Science
University of North Carolina at Pembroke
Pembroke, North Carolina
[email protected]

Leighanne Hsu
Computer Science
The College of New Jersey
Ewing, New Jersey
[email protected]

Saad Biaz
Auburn University
Auburn, Alabama
[email protected]

Chase Murray
Auburn University
Auburn, Alabama
[email protected]

Abstract—GPS-denied environments pose a significant problem for unmanned aerial vehicles, or UAVs, operating autonomously within a confined airspace. In this paper, we outline and implement a basic algorithm to autonomously fly a quadcopter using low-cost sonar sensors in a GPS-denied environment. Furthermore, we outline a method for using simple sonar distance data to map our environment. Through the use of these sonar sensors and simplistic mapping methods, we build the foundation for a SLAM-based approach to operating indoors using a low-budget system.

I. INTRODUCTION

Unmanned aerial vehicles (UAVs) are growing popular around the world in both the military and private sectors. They have provided us with new tools with which to accomplish tasks faster, more efficiently, and, most importantly, safely. However, a major disadvantage of fixed-wing UAVs is their need for copious amounts of airspace. They are not ideal in situations in which there is limited airspace, nor are they capable of any sort of stable indoor flight. Additionally, many UAVs depend on accurate GPS systems in order to complete their tasks. To handle tasks which afford only minimal airspace, helicopter-based UAVs were developed. With the ability to hover and remain stable in a limited airspace, helicopter-based systems open UAVs up to a new realm of accomplishable tasks. These tasks include search and rescue, mapping, and observation of targets within a limited airspace such as a building. Furthermore, many of the roles a helicopter-based UAV is designed for are precisely in GPS-denied environments such as buildings. Autonomous flight therefore becomes more complicated, as one has to take into account a variety of variables which are not always static. Overcoming these variables which inhibit indoor flight is quickly becoming a research topic for unmanned aerial vehicles. As mentioned before, a major research initiative in indoor flight has the quadcopter performing autonomous missions without the guidance of a pilot. However, this brings to light navigational issues. In a GPS-denied area, such as a building, the first problem to overcome is that one's exact location and environment are unknown. Typical solutions to this problem are implemented using Simultaneous Localization and Mapping (SLAM) approaches. There are several examples of this approach using cameras [1], [3], [6], [7], [8], [11], [12], lasers [9], [10], and sonar [5].

Of these examples, many are ground-based robots [1], [5], [9], [11] or aquatic-based systems [14]. To the best of our knowledge, however, no airborne SLAM algorithm takes in only sonar data. Current airborne systems use lasers [10] or cameras [6], [8], as well as combinations of sonar, laser, and cameras [3], but none utilize only low-cost sonar on an aerial platform. Lasers and cameras can be expensive, both in terms of computation and power as well as in terms of weight. As such, our goal was to design and develop a low-cost, simplistic sonar-based SLAM approach for flying a UAV autonomously in a GPS-denied environment, without the use of more expensive laser or visual-based sensors for targeted path planning, nor the need for a ground station to handle computation.

II. LITERATURE REVIEW

GPS is a convenient way of determining one's location, but not all environments are GPS-accessible. Underwater and indoor systems do not have this luxury and therefore need a different method of tracking their own location. Simultaneous Localization and Mapping (SLAM) algorithms allow such systems to map out their environments using sensor data, then track their location within the map of their environment. However, sensors can be inaccurate, and the system has no way to pinpoint its location for certain. Instead, the system uses its sensors to obtain a probability of whether or not it is actually located where it believes it is located. SLAM algorithms typically identify landmarks to travel in relation to. These allow the system to travel and track its location by calculating a probability of its location [7].

A. Filters

Localization involves the use of many different sensors, which are necessary to determine the heading, distance, and speed of the system while in motion. These sensors are often prone to drift: after some amount of time, they become increasingly prone to error. As the degree of error increases, the exact location of the system deviates more and more from its estimated location. Because of this uncertainty, data from many sensors is often necessary to try to correct the error in determining the location of the system. To do this, filters such as the Kalman filter and its variants are utilized.

Fig. 1. The essence of the SLAM problem. Estimation of landmark and robot locations is required. [7]

The Kalman filter is commonly used for data fusion. It is composed of two models: the process model and the observation, or measurement, model. These are assumed to be independent, and as such can be calculated separately [15]. This filter is a Bayesian filter which works on linear systems and assumes the noise affecting the process and observation models has a Gaussian distribution [6]. In the equations below, A represents the state transition, B the motion model (applied to the control input \mu_t), and C the sensor model. The errors are represented with E_x for process noise and E_z for observation noise.

Process Model

    \bar{x}_t = A x_{t-1} + B \mu_t                               (1)

    \bar{\Sigma}_t = A \Sigma_{t-1} A^T + E_x                     (2)

Observation Model

    K_t = \bar{\Sigma}_t C^T (C \bar{\Sigma}_t C^T + E_z)^{-1}    (3)

    x_t = \bar{x}_t + K_t (z_t - C \bar{x}_t)                     (4)

    \Sigma_t = (I - K_t C) \bar{\Sigma}_t                         (5)

The purpose of the process model is to predict the state of the system. This typically combines the location of the system at a given time with the velocity vector applied at that time, which results in the next position of the system. This is also known as the state estimate. To determine this, estimates of the mean (1) and the covariance (2) are computed.

Then a measurement update is applied. This analyzes the location of the system based on distance measurements to landmarks or the surrounding environment. (4) and (5) return a state and covariance estimate update based on sensor data. (3) describes the Kalman gain, which weights the measurements against the state estimate: a high gain places more emphasis on the measurements, while a low gain places more emphasis on the predicted state. In other words, if the measurement returned is fairly close to that of the process model, then the predicted state is accepted as accurate. Otherwise, the estimate is biased toward the measurement returned, but not so much that it severely changes the prediction. This modified estimate is then accepted as accurate. In other words, the measurement update serves to correct the error that may exist in the prediction [6]. This filter does not rely on the history of all previous locations and motion vectors, but instead uses only the prior location and motion vector to calculate the posterior location. As a result, it can be run in real-time on a system [7].
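To make the filter concrete, the following is a minimal C++ sketch of equations (1)-(5) for a one-dimensional state, in which A, B, C, E_x, and E_z collapse to scalars. The structure and constants are illustrative only, not the filter used on our platform.

```c++
// Minimal scalar Kalman filter illustrating equations (1)-(5).
// Illustrative sketch: the state is a 1-D position, and the noise
// terms Ex and Ez are assumed constant.
#include <cstdio>

struct Kalman1D {
    double x;   // state estimate (position)
    double P;   // estimate covariance (Sigma)
    double A;   // state transition
    double B;   // motion model
    double C;   // sensor model
    double Ex;  // process noise
    double Ez;  // observation noise

    // Process model: predict the next state and covariance, eqs. (1)-(2).
    void predict(double u) {
        x = A * x + B * u;           // (1) state estimate
        P = A * P * A + Ex;          // (2) covariance estimate
    }

    // Observation model: correct the prediction with measurement z, eqs. (3)-(5).
    void update(double z) {
        double K = P * C / (C * P * C + Ez);  // (3) Kalman gain
        x = x + K * (z - C * x);              // (4) corrected state
        P = (1.0 - K * C) * P;                // (5) corrected covariance
    }
};

int main() {
    Kalman1D kf{0.0, 1.0, 1.0, 1.0, 1.0, 0.01, 0.25};
    // Command 0.1 m of motion per step and fuse noisy range readings.
    double readings[] = {0.12, 0.21, 0.28, 0.41, 0.52};
    for (double z : readings) {
        kf.predict(0.1);
        kf.update(z);
        std::printf("estimate: %.3f (P = %.4f)\n", kf.x, kf.P);
    }
}
```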

The basic Kalman filter is only effective on linear systems, however. There exist variants of the basic Kalman filter to address nonlinear systems, in which either the process model or the observation model (or both) is nonlinear. One solution to this problem is to approximate and linearize the model, then apply the filter. This is how the Extended Kalman Filter, or EKF, works. To do this, it uses a Taylor approximation of the mean, which results in a linear function [6]. However, because the Gaussian random variable is propagated through the linearization, errors are introduced and the EKF becomes less accurate and therefore suboptimal. To remedy this, the Unscented Kalman Filter, or UKF, was developed. It uses a minimal number of sample points to more accurately capture the mean and covariance of the Gaussian random variable. The UKF has the same order of computational complexity as the EKF, yet it is much more accurate [16].

In addition to the Kalman filter variants, the Rao-Blackwellized particle filter (RBPF) is also commonly used in SLAM systems. The Rao-Blackwellized particle filter estimates the posterior over all the potential trajectories the robot can take. This posterior is then utilized to compute the posterior over maps and trajectories [9].

B. Simultaneous Localization and Mapping


SLAM algorithms often combine information from a variety of sensors. Three main types of systems are visual, laser-based, and sonar-based systems.

C. Visual SLAM

A growing field in SLAM research is visual-based sensors, which are usually cameras mounted on the system. SLAM has utilized other sensors in the past with a high degree of accuracy. Range sensors, however, have the distinct problem that they traditionally only measure on a single plane. They also make certain assumptions about the environment. Since these assumptions do not always hold true in real environments, several visual SLAM, or VSLAM, solutions were developed to counteract them [1], [3], [6], [7], [8], [11], [12]. These solutions provide the system with the ability to recognize objects, and thus many rely on landmark-based navigation. Landmark-based navigation involves identifying static environment variables on which to base localization and navigation. The system's distances from these landmarks play a role in determining how it interacts with its environment. However, landmark identification suffers from the inherent problem of noise. Sensor noise, which can produce unreliable or inaccurate data, is a steadfast problem in SLAM. As a result, SLAM algorithms generally need to use probabilistic models to determine the probability that the system is where its sensors say it is. While VSLAM provides new ways to explore unknown environments, it is limited by the fact that poor environmental conditions can directly affect the ability of the robot to operate effectively and safely. In low-light conditions, certain features used as landmarks can be hard to see or identify, leaving the robot with inherently noisy data. Additionally, VSLAM can be computationally intensive. As such, many UAVs which utilize VSLAM run path planning and landmark identification algorithms on a ground station, which has significantly more processing power.

D. Laser-based SLAM

There are several examples of navigation by laser scanners [9], [10]. These scanners tend to be 2-dimensional scanners, which can create depth maps, although simple 1-dimensional range finders also exist. The depth maps produced by 2-dimensional scanners can be used to determine distance and orientation. However, some sort of filter is still required to deal with the inherent noise created by these sensors. The Kalman filter variants and the Rao-Blackwellized particle filter are commonly used for this purpose. Another issue lies in the fact that most approaches assume the motion of the robot is on a single plane, which is not true of aerial systems. Nevertheless, these kinds of scanners have been successfully implemented on an aerial system in [10]. In their system, however, data was received from the aerial system via a wireless connection, and the computationally intensive algorithms were run on a laptop.

E. Sonar-based SLAM

An inherent problem with laser range finders is their weight and cost. An alternative is to use sonar sensors. As with laser range finders, sonar sensors come in 1-dimensional and 2-dimensional variants. Advanced sonar-based systems operate similarly to those which utilize laser range finders. Many advanced sonar sensors, which tend to be 2-dimensional, can also return depth maps of their surroundings. As with the laser-created depth maps, an extended Kalman filter or a Rao-Blackwellized particle filter can be used to clean up any noise inherently present in the data. However, sonar is less accurate than laser- or visual-based systems. As a result, sonar sensors are typically only used in conjunction with other sensors. The most common application of sonar is the single-dimension sonar range finder, which returns the straight-line distance to the closest object in the sonar's path. These kinds of sonar sensors are integrated to help increase the stability and collision-avoidance ability of robots, particularly airborne systems [3]. However, sonar has also been adapted for water-based systems such as [14].

III. ARCHITECTURE

Our goal was to build a lightweight, low-cost system capable of autonomous flight. To accomplish this, we needed a stable platform on which to test our approaches. For this purpose, we opted to utilize the Parrot AR.Drone 2.0. This quadcopter is capable of stable flight with a raw battery life of approximately 15 minutes. It creates its own wireless network, through which commands and data can be sent and received. This data includes navigation data such as the control state, battery level, x-velocity, y-velocity, z-velocity, and altitude. Furthermore, it is an open system with which software can freely be integrated to communicate with the AR.Drone, to retrieve that data, and to send commands to be executed. After much experimentation with other systems, we ultimately decided to use the AR.Drone's SDK, which is provided for game development with the quadcopter.

For our sonar sensors, we chose to utilize Maxbotix LV-MaxSonar-EZ-1 High Performance Sonar Range Finders, which are capable of taking pulse-width readings. Each sonar operates at approximately 20 Hz. To integrate a total of five sonar sensors (top, back, left, right, front), we chose to utilize a Raspberry Pi. The Raspberry Pi has eight general-purpose input-output (GPIO) pins, which simplified our setup. We were able to connect the sensors to these input-output pins to take readings via pulse width modulation. To accommodate the time needed to take one reading from each of the five sonar sensors (approximately 250 ms), we implemented a polling loop that rapidly checks for pin state changes and marks the time of those changes. This allows us to check our sonars at a more rapid pace (approximately every 50-60 ms).

Finally, we wanted to include a laptop ground station in our system for the purpose of emergency control only. Our system is intended to be capable of localized autonomous flight. All commands should be sent from the Raspberry Pi, which is attached to the quadcopter, to the quadcopter itself.


Fig. 2. The system overview of our diagnostic program

This lessens the impact of distance or interference from obstacles on the wireless connection, because the controller is moving with the vehicle. However, the user should still be able to take control of the quadcopter if needed. This is useful for testing purposes, but also for future deployment, as it allows the user to stop the quadcopter in the case of an emergency. The user should be able to take control of the flight at any point if desired. He or she should also be able to command the quadcopter to land or return to the starting point at any time during the mission. For this purpose, we decided to use two XBee RF modules with USB adapters. Our Model B Raspberry Pi includes two USB ports, allowing us to plug in an XBee module and a wireless adapter for communication with the ground station and the quadcopter, respectively.

We chose to write our system navigation program in C++, since we could utilize C to effectively program our GPIO pins for pulse width modulation as well as for sending and receiving UDP packets. This was necessary in order to connect the 5-sonar sensor array to our AR.Drone. We will go into more detail about the system implementation in a later section.

Our system hierarchy was broken into two major sections: diagnostics and the navigation system. In the diagnostics program, we wanted to initialize every component that would be necessary for flying the UAV.

Fig. 3. Sonar readings implementation


Fig. 4. Controller implementation

This included initializing our sonars and ensuring we had an active XBee connection with our ground station. The purpose of the latter was to be able to override the system at any given point, granting the user manual control of the system. To initialize the sonar, we set the GPIO pins as input, since the sonars begin sending out signals after they receive power. When a signal is sent out, the GPIO pin is set to high. Upon receiving the signal back, the GPIO pin is set to low. After some time, the pin is set back to high, indicating another pulse has been sent out. The time in microseconds between when the pulse goes high and when it goes low is returned. With the EZ-1 sonars, every 147 microseconds equates to 1 inch. This information was used for the diagnostic test. If, upon initialization, the top sonar received a measurement indicating that the overhead distance was less than 1.5 meters, the function would return false, indicating that the quadcopter does not have enough room for takeoff. The XBee diagnostic check involved receiving an init string from the ground station to ensure that messages could be received. The XBee module looks for bookmarks on either side of any string it receives. These bookmarks can be set to any two characters the user desires using the structure provided. If the wrong message was received, or if no message was received at all, the XBee check would return false. Upon receiving false from any of the checks, the program will exit to prevent launch. This ensures that the system is always fully functional and safe before attempting to fly a mission.
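For illustration, the following sketch shows one pulse-width reading and the 147-microseconds-per-inch conversion. It assumes the Linux sysfs GPIO interface with busy-polling; the pin paths and helper names are assumptions rather than our exact code.

```c++
// Illustrative sketch of one pulse-width reading over the Linux sysfs GPIO
// interface. Returns distance in inches, where 147 us of high time equals
// 1 inch on the LV-MaxSonar-EZ-1.
#include <chrono>
#include <fstream>
#include <string>

static int readPin(const std::string& path) {
    std::ifstream f(path);
    char v = '0';
    f >> v;
    return v - '0';
}

double readSonarInches(int pin) {
    using clock = std::chrono::steady_clock;
    const std::string path =
        "/sys/class/gpio/gpio" + std::to_string(pin) + "/value";

    while (readPin(path) == 1) {}   // wait out any pulse already in progress
    while (readPin(path) == 0) {}   // wait for the next pulse to go high
    auto start = clock::now();
    while (readPin(path) == 1) {}   // pulse ends when the pin drops low
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                  clock::now() - start).count();
    return us / 147.0;              // 147 us per inch (EZ-1 datasheet)
}
```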

functional and safe before attempting to fly a mission.After receiving clearance for takeoff from the diagnostic

function, the ground station is prompted with a request tolaunch. Should the launch command be sent, the diagnosticprogram will launch the controller. At this point in time thecontroller will begin its automated navigation protocol. Thenavigation protocol currently takes three readings from eachsensor and takes the median. This early attempt is just a simplemeans to handle some of the noise received by the sensors. Thecontroller then checks each of these measurements to see ifthe quadcopter has moved into an undesired location too closeto a wall or object. If an object is detected, the quadcopter willstop its forward movement and move in the direction oppositeof the sensor which because less than its allocated buffer. Uponreading a safe distance once again, the Controller will continueits loop and begin moving forward again until another sensorraises a warning to take some action. Further implementation isnecessary and more protocols should be developed in order toallow the quadcopter to navigate autonomously. One possibleimplementation being discussed is the left hand rule, where ifpresented the option to go in multiple directions, we go left.
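A minimal sketch of this control loop is shown below. The median-of-three filtering and buffer check follow the protocol described above; the 24-inch buffer and the helper functions are illustrative stand-ins for the real sonar and command plumbing.

```c++
// Minimal sketch of the navigation protocol: take three readings per sonar,
// use the median, and back away from any sensor whose reading falls below
// its safety buffer. Helpers and constants are illustrative assumptions.
#include <algorithm>
#include <cstdio>

enum Direction { FRONT = 0, BACK, LEFT, RIGHT, TOP, NUM_DIRS };

// Stubs standing in for the pulse-width reads and AT* command plumbing.
double readSonarInches(Direction d) { return 60.0; }
void stopForward()                  { std::puts("stop"); }
void moveOpposite(Direction d)      { std::printf("move away from %d\n", d); }
void moveForward()                  { std::puts("forward"); }

double medianOfThree(double a, double b, double c) {
    return std::max(std::min(a, b), std::min(std::max(a, b), c));
}

void controllerLoop() {
    const double bufferInches = 24.0;      // assumed safety buffer
    while (true) {                         // runs until landing is commanded
        bool obstacle = false;
        for (int d = FRONT; d < NUM_DIRS; ++d) {
            Direction dir = static_cast<Direction>(d);
            double dist = medianOfThree(readSonarInches(dir),
                                        readSonarInches(dir),
                                        readSonarInches(dir));
            if (dist < bufferInches) {     // too close to a wall or object
                stopForward();
                moveOpposite(dir);
                obstacle = true;
            }
        }
        if (!obstacle)
            moveForward();                 // resume forward flight
    }
}
```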

Fig. 5. Raspberry Pi

A. ROS

The Robot Operating System (ROS) was initially considered for this project, but in the end it was not an effective tool for interfacing with the AR.Drone. It proved very difficult to get ROS running within the limitations imposed by the Raspberry Pi, particularly since older versions of ROS were not compatible. Furthermore, most packages which could have been utilized for communication between a computer and the AR.Drone were not compatible with the most up-to-date version of ROS, ROS Groovy. As a result, we switched to a Node.js package.

B. Node.js

Node.js was initially investigated as a suitable interface between the Raspberry Pi and the AR.Drone, as a quick and easy means of executing commands and receiving data. Integrating an interface with Node.js, however, proved to be too much trouble. When utilizing Node.js, we had to fork a process in order to launch Node.js, which we did with a system call. Following this, it was necessary to ensure the process was properly running, so we would have to pass it a command and compare the output with what should be returned. Node.js, however, would throw errors if the pipe was closed prior to writing or reading completion, crashing the program. To counteract this, we opted to send TCP packets to Node.js and set up a server/client relationship. This proved to be much more effective. However, due to the natural speed limitations of the package we utilized, we could not effectively build a real-time system. The execution time for commands was significant and resulted in memory issues as commands were queued to be executed. In the end, it was more effective and efficient to directly send commands via UDP packets to the AR.Drone.

C. UDP Packets

It was determined that the best way to interact with the AR.Drone would be to send it UDP packets, which it can natively handle. The quadcopter produces its own wireless network and accepts UDP packets on 192.168.1.1, port 5556. Therefore, our program was modified to operate by sending UDP packets to the AR.Drone. These packets, which are sent in the form of AT* commands, are executed immediately by the AR.Drone. The AT* commands are sent in the form of 8-bit ASCII characters with a line feed character at the end. They contain the state of the command to be executed, which is set with a value between -1 and 1. The AR.Drone also natively supports the return of navigation data over 192.168.1.1, port 5554. This data, which is returned after the quadcopter receives the AT* command asking for the information, continuously comes in as a stream. This data has to be parsed in order for it to be used by the system, as all navdata demo information is returned in a stream of 500 bits. Much of this data is deprecated and no longer used. We stored each piece of information individually in a message structure, which can be passed anywhere in the program via accessor methods. These packets are the lifeline between the Raspberry Pi and the AR.Drone. The different types of messages which can be sent to the AR.Drone are summarized in Figure 6.
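The following sketch illustrates sending a single AT* command over UDP as described above. The AT*REF takeoff argument shown comes from the public AR.Drone developer documentation and should be treated as illustrative; consult the SDK for the full AT* grammar.

```c++
// Sketch of sending one AT* command to the AR.Drone over UDP
// (8-bit ASCII with a trailing line feed, port 5556, as described above).
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <string>

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    sockaddr_in drone{};
    drone.sin_family = AF_INET;
    drone.sin_port = htons(5556);                        // AT* command port
    inet_pton(AF_INET, "192.168.1.1", &drone.sin_addr);  // drone's own network

    // Every command carries an increasing sequence number; the drone drops
    // any packet whose number is lower than the last one it accepted.
    int seq = 1;
    std::string cmd = "AT*REF=" + std::to_string(seq++) + ",290718208\n";  // takeoff

    sendto(sock, cmd.c_str(), cmd.size(), 0,
           reinterpret_cast<sockaddr*>(&drone), sizeof(drone));
    close(sock);
}
```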

D. Filtering and Data

In addition to the two-part filtering that is needed for SLAM algorithms to work, the raw sonar data needs to be filtered as well. As mentioned previously, sonar sensors are less expensive than laser sensors, but they produce more noise. Due to the speed limitation of our sonar sensors, we are limited in the number of readings we can use for real-time filtering. For simplicity, we started with a basic median filter, which returns the median of the last few readings received. Currently, this takes three readings, as taking more results in more time before a noticeable change in distance is detected. Additionally, with such a small dataset, extensive noise can result in large numbers of outliers, which can still get through the filter. A key property of a simple median filter is that it will lag behind the data it is filtering. As a result, the median filter is not an optimal solution for real-time filtering, especially with such a small dataset, but it is sufficient as a starting solution to filter out some of the outliers.

Fig. 6. AT Commands

We can also receive navigation data from the quadcopter, which includes velocity in three axes, heading, and altitude. This information, while useful in-flight, is even more critical for localization and mapping of the quadcopter's environment. As previously mentioned, the Kalman filter variants consist of two parts: the process model and the observation model. The process model uses the previous location and velocity to predict where the vehicle is at any point in time. We can pull this data from the navigation data received from the quadcopter. Our sonar sensors and the altitude reading from the quadcopter provide the information needed for the measurement update. However, this information can also be used to further stabilize the quadcopter in small spaces. Though it is relatively stable, in confined spaces the quadcopter can be affected by its own propeller wash reflecting off the surrounding environment. This, combined with the added weight of the Raspberry Pi and sonar array, can potentially cause the quadcopter to drift into a wall. The sonar sensor readings can be used here as well to improve the flight safety of the equipment. By detecting and correcting drift using sonar readings and heading changes from the navigation data, we can allow for a more stable flight and avoid collisions with nearby obstacles. This issue is discussed in more detail later.
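One possible form of such a correction, sketched below under assumed thresholds, compares consecutive filtered side-sonar readings while hovering and commands a small counter-roll when the drone slides toward a wall. The sendRoll helper and constants are illustrative assumptions, not our implemented method.

```c++
// Illustrative drift-correction sketch using consecutive side-sonar readings.
#include <cstdio>

void sendRoll(double roll) {                  // stub for an AT*PCMD roll command
    std::printf("roll %.2f\n", roll);         // roll is in [-1, 1]
}

void correctDrift(double leftInches, double& lastLeft) {
    const double threshold = 2.0;             // inches moved between loop passes
    const double gain = 0.05;                 // small fraction of maximum roll
    if (lastLeft >= 0.0) {
        double delta = lastLeft - leftInches; // positive: drifting toward the wall
        if (delta > threshold)
            sendRoll(gain);                   // nudge right, away from the wall
        else if (delta < -threshold)
            sendRoll(-gain);                  // nudge left
    }
    lastLeft = leftInches;                    // remember for the next pass
}
```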

IV. ISSUES FACING ANY CONTROLLER PLUGGED INTO THE SYSTEM

There are several factors which must be taken into consideration when developing a controller for this platform. These factors include drift, packet loss, and sonar noise.

A. Sonar Noise

Sonar noise has been discussed in detail, and a filter must be included to help clean up any inconsistencies. However, it should be reiterated that any controller loop must be able to handle inaccurate information which finds its way past the filter. A common problem observed in the system is the sonar reporting a distance to an obstacle that is closer than the true distance. The sensors typically underestimate their distance by around a foot when moving. This increase in noise could be the result of various individual factors or a combination of them, including a moving platform and increased vibration from the motors. Therefore, any controller loop must be able to handle and recover from false readings without permanently blocking movement in the direction those readings came from.

Fig. 7. Sonar readings; the red line is the data which would be returned using a median filter

B. Packet Loss

Packet loss is a problem faced whenever UDP packets are sent and received. It poses particular problems for any real-time system which relies on the accuracy of the information provided by the quadcopter. When sending commands to the AR.Drone, each command must be given a sequence number. This number is used to help keep track of each message the drone receives. The drone has an internal counter, which increments with each packet received. Should a packet arrive with an id lower than the previous command's id, the packet will be ignored. Likewise, any corrupted packets whose message cannot be decoded by the AR.Drone will be ignored. It is recommended that commands be sent multiple times to ensure that they are received by the AR.Drone. Should the drone receive a command which it is already executing, it will simply continue its execution. However, if it were to receive, for instance, a go-left command while executing a forward movement command, the drone will execute both commands simultaneously and move along the vector proportional to the forward and leftward speeds it was told to move at. If it were to receive a hover command, all movement commands are set to zero. Therefore, commands which are critical (such as hover) should be issued more than once to ensure the packet is received by the AR.Drone.
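A sketch of this practice, with an assumed sendHover helper that stamps each resend with a fresh sequence number so the drone does not discard it as stale:

```c++
// Illustrative resend of a critical command over unreliable UDP.
void sendHover() { /* build and send one AT*PCMD hover packet here,
                      using the next sequence number */ }

void sendCriticalHover(int repeats = 3) {
    for (int i = 0; i < repeats; ++i)
        sendHover();   // repeated sends make delivery of at least one likely
}
```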

C. Drift

Drifting is an issue which must also be identified and taken into consideration by any controlling algorithm implemented for this system. The AR.Drone stabilizes itself via a downward-facing camera which locks onto a contrasting image, such as a dot on the ground. It attempts to keep its target image in the same location at which it first identified it. This, however, can be a problem when the camera cannot identify a significant feature to lock onto. In such situations, the AR.Drone can exhibit significant drift as it searches for a target to lock onto.

Another problematic source of drift is backwash produced by the quadcopter's propellers. When all components are added to the AR.Drone, it flies just under its targeted height of one meter. At this height, the downward force of the wind bouncing off the walls and floor, compounded with the weight of the quadcopter's payload, produces enough turbulence to cause the AR.Drone to drift in a variety of ways. Thus, a controller algorithm must take into consideration that in locations without much clearance to either side, such as a narrow hallway, backwash while hovering is an issue.

V. FUTURE WORK

This platform is effectively ready for use with various path planning and SLAM algorithms. However, there remain several key features which should be considered as future avenues to increase the effectiveness of this system as a platform. These include implementing emergency control commands which would allow a ground station to take over and provide manual navigation in an emergency situation. Currently, the system only provides for an emergency land sequence to be executed if it receives any message over the XBee connection.

Additionally, a better method for filtering out noise might be explored. The current implementation only considers the median of the past three measurements. This was done as a quick way to remove significant outlying values which might result from inaccuracies or noise in the sonar readings. This implementation, however, lags behind the data by approximately 60 ms (the time of the loop execution). If the drone detects an object in front of it, since the distances will be decreasing as it approaches, it will require two accurate readings in order to achieve the median required to execute a stop command. This is a problem with the way a pure median filter works. A filter which favors real-time systems might be implemented or added to improve this feature.

In the XBee communication, messages should be forwarded back to the user. Currently, they are printed to the console of the Raspberry Pi. The speed of our sonar readings could also possibly be increased by using some sort of interrupt when the pin state changes, thus allowing us to execute other commands while the sonar sensors take their measurements. However, the time of execution will not speed up greatly. There will only be a slight increase in performance, since the current model of sonar we use can only operate at 20 Hz at its fastest.

After these features have been implemented, a SLAM algorithm which operates on simple range data, and which has low overhead, could be implemented and utilized by this system. This, combined with a loop-closing, path-planning algorithm, could provide a completely self-contained system. Ideally, such a system could operate autonomously, without any user input after startup. The applications of such a system are numerous.

Fig. 8. Our setup with the Pi and AR Drone

VI. CONCLUSION

In the growing field of unmanned aerial flight, maneuvering in confined or indoor spaces is becoming increasingly popular. Such spaces present a particular difficulty in the area of autonomous flight. Obstacles tend to be closer and more frequent in confined spaces, and indoor spaces also require the vehicle to determine its location without a GPS signal. For such spaces, helicopter-based systems are a significant improvement over the fixed-wing aircraft frequently used outdoors, due to their ability to hover in place and change direction easily. Such vehicles can be used for a variety of tasks, including mapping and search-and-rescue. However, such systems can prove expensive. Current aerial systems typically use visual or laser-based SLAM algorithms to map the environment and localize the UAV. Laser-based payloads can be heavy, and both visual and laser systems often require a large amount of computing power. As a result, these systems are rarely self-contained, and instead require a connection to a ground station from which commands can be sent.

As such, our goal was to set up and develop a relatively low-cost, self-contained platform that could navigate and map an indoor environment autonomously. For this, we chose a commercial quadcopter capable of stable flight and simple, one-directional sonar range finders. We used a Raspberry Pi, an affordable microcomputer, to integrate these components into a contained system. This eliminated the need for a ground station to handle computations on the data returned by the sensors.

The system we created provides a solid platform for the incorporation of SLAM and other path planning algorithms discussed in the literature review section. This system provides fast access to the data available via the AR.Drone. We further provide access to filtered sonar distance data in every direction except below, as this is accessible in the AR.Drone navigation data as altitude. This platform allows an algorithm to be plugged into the controller loop with ease. Combined with the overall diagnostic check, this provides a self-contained system with the ability to easily adapt to plug-ins; offline computing resources are not needed for navigation control. Traditional aerial SLAM platforms require laser range finders or utilize visual SLAM techniques. As a result, they are not ideal for a self-contained system, whose computing power is always contained within the drone's immediate area. However, any controlling algorithm must take into consideration the inconsistencies in the quadcopter's behavior mentioned above. If this is done, the proposed system can provide the user with all the data required for safe operation, as well as some fail-safes which can protect the system prior to takeoff.

ACKNOWLEDGMENT

The authors would like to thank Auburn University. We would also like to thank all those involved in the lab, including the REU students from the 2013 program. Additionally, the help of Jagadeesh Balasubramani and Kang Sun was greatly appreciated.

REFERENCES

[1] Arturo Gil, Óscar Reinoso, Mónica Ballesta, Miguel Juliá, "Multi-robot visual SLAM using a Rao-Blackwellized particle filter," Robotics and Autonomous Systems, vol. 58, no. 1, pp. 68-80, 31 January 2010

[2] Ax, Markus; Kuhnert, Lars; Langer, Matthias; Schlemper, Jens; Kuhnert, Klaus-Dieter, "Architecture of an autonomous mini unmanned aerial vehicle based on a commercial platform," Robotics (ISR), 2010 41st International Symposium on and 2010 6th German Conference on Robotics (ROBOTIK), pp. 1-6, 7-9 June 2010

[3] Bills, C.; Chen, J.; Saxena, A., "Autonomous MAV flight in indoor environments using single image perspective cues," Robotics and Automation (ICRA), 2011 IEEE International Conference on, pp. 5776-5783, 9-13 May 2011

[4] Bristeau, Pierre-Jean; François Callou; David Vissière; Nicolas Petit, "The Navigation and Control technology inside the AR.Drone micro UAV," World Congress, vol. 18, no. 1, pp. 1477-1484, 2011

[5] Chong, Kok Seng; Kleeman, Lindsay, "Feature-based mapping in real, large scale environments using an ultrasonic array," The International Journal of Robotics Research, vol. 18, no. 1, pp. 3-19, 1999

[6] Dijkshoorn, Nick, "Simultaneous localization and mapping with the AR.Drone," Master's thesis, Universiteit van Amsterdam, 2012

[7] Durrant-Whyte, H.; Bailey, Tim, "Simultaneous localization and mapping: part I," Robotics and Automation Magazine, IEEE, vol. 13, no. 2, pp. 99-110, June 2006

[8] Ghadiok, Vaibhav; Goldin, Jeremy; Wei Ren, "Autonomous indoor aerial gripping using a quadrotor," Intelligent Robots and Systems (IROS), 2011 IEEE/RSJ International Conference on, pp. 4645-4651, 25-30 Sept. 2011


[9] Grisetti, G.; Stachniss, C.; Burgard, W., "Improving Grid-based SLAM with Rao-Blackwellized Particle Filters by Adaptive Proposals and Selective Resampling," Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on, pp. 2432-2437, 18-22 April 2005

[10] Grzonka, S.; Grisetti, G.; Burgard, W., "A Fully Autonomous Indoor Quadrotor," Robotics, IEEE Transactions on, vol. 28, no. 1, pp. 90-100, Feb. 2012

[11] Jong-Hyuk Kim; Sukkarieh, S., "Airborne simultaneous localisation and map building," Robotics and Automation, 2003. Proceedings. ICRA '03. IEEE International Conference on, vol. 1, pp. 406-411, 14-19 Sept. 2003

[12] Robert Sim; James J. Little, "Autonomous vision-based robotic exploration and mapping using hybrid maps and particle filters," Image and Vision Computing, vol. 27, issues 1-2, pp. 167-177, 1 January 2009, ISSN 0262-8856

[13] Simo Särkkä; Aki Vehtari; Jouko Lampinen, "Rao-Blackwellized particle filter for multiple target tracking," Information Fusion, vol. 8, no. 1, pp. 2-15, January 2007

[14] Ribas, D.; Ridao, P.; Neira, J.; Tardos, J.D., "SLAM using an Imaging Sonar for Partially Structured Underwater Environments," Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on, pp. 5040-5045, 9-15 Oct. 2006

[15] Welch, G.; Bishop, G., "An introduction to the Kalman filter," 1995

[16] Wan, E.A.; Van der Merwe, R., "The unscented Kalman filter for nonlinear estimation," Adaptive Systems for Signal Processing, Communications, and Control Symposium 2000. AS-SPCC. The IEEE 2000, pp. 153-158, 2000

[17] Bailey, Tim; Durrant-Whyte, H., "Simultaneous localization and mapping (SLAM): part II," Robotics and Automation Magazine, IEEE, vol. 13, no. 3, pp. 108-117, Sept. 2006

Fig. 9. AR.Drone hovering

Fig. 10. AR.Drone moving

Fig. 11. Another view of the AR.Drone

Fig. 12. Front view of the AR.Drone