

LOW COST FRAMEWORK FOR PARAMETER IDENTIFICATION OF UNMANNED AERIAL VEHICLES

By

Ahmad Adel Ahmad Metwalli Khattab

A Thesis Submitted to the
Faculty of Engineering at Cairo University
in Partial Fulfillment of the
Requirements for the Degree of

MASTER OF SCIENCE
in
Aerospace Engineering

FACULTY OF ENGINEERING, CAIRO UNIVERSITY
GIZA, EGYPT

2015


LOW COST FRAMEWORK FOR PARAMETER IDENTIFICATION OF UNMANNED AERIAL VEHICLES

By

Ahmad Adel Ahmad Metwalli Khattab

A Thesis Submitted to the
Faculty of Engineering at Cairo University
in Partial Fulfillment of the
Requirements for the Degree of

MASTER OF SCIENCE
in
Aerospace Engineering

Under the Supervision of

Prof. Dr. Ayman Hamdy Kassem
Professor of Flight Dynamics and Control
Aerospace Engineering Department
Faculty of Engineering, Cairo University

Prof. Dr. Gamal Mahmoud Sayed Elbayoumi
Professor of Flight Dynamics and Control
Aerospace Engineering Department
Faculty of Engineering, Cairo University

FACULTY OF ENGINEERING, CAIRO UNIVERSITY
GIZA, EGYPT

2015


LOW COST FRAMEWORK FOR PARAMETER IDENTIFICATION OF UNMANNED AERIAL VEHICLES

By

Ahmad Adel Ahmad Metwalli Khattab

A Thesis Submitted to the
Faculty of Engineering at Cairo University
in Partial Fulfillment of the
Requirements for the Degree of

MASTER OF SCIENCE
in
Aerospace Engineering

Approved by the Examining Committee:

Prof. Dr. Ayman Hamdy Kassem, Thesis Main Advisor

Prof. Dr. Gamal Mahmoud Sayed Elbayoumi, Thesis Co-Advisor

Prof. Dr. Mohamed Nader Mohamed Abuelfoutouh, Internal Examiner

Prof. Dr. Omar Al-Farouk Abd El-hameid, External Examiner, Military Technical College

FACULTY OF ENGINEERING, CAIRO UNIVERSITY
GIZA, EGYPT

2015


Engineer’s Name: Ahmad Adel Ahmad Metwalli Khattab
Date of Birth: 1/10/1989
Nationality: Egyptian
E-mail: [email protected]
Phone: 00201020549948
Address: 239L Hadaek Alahram
Registration Date: 1/10/2011
Awarding Date: / /2015
Degree: Master of Science
Department: Aerospace Engineering

Supervisors:
Prof. Dr. Ayman Hamdy Kassem
Prof. Dr. Gamal Mahmoud Sayed Elbayoumi

Examiners:
Prof. Dr. Ayman Hamdy Kassem (Thesis Main Advisor)
Prof. Dr. Gamal Mahmoud Sayed Elbayoumi (Thesis Co-Advisor)
Prof. Dr. Mohamed Nader Mohamed Abuelfoutouh (Internal Examiner)
Prof. Dr. Omar Al-Farouk Abd El-hameid (External Examiner), Military Technical College

Title of Thesis:

LOW COST FRAMEWORK FOR PARAMETER IDENTIFICATION OF UNMANNED AERIAL VEHICLES

Key Words:

Unmanned aerial vehicles; Parameter identification; Autopilot; Hopfield; Recurrent neural network

Summary:
In this thesis, a framework was designed and tested to identify the parameters of unmanned aerial vehicles. The framework was designed to be low-cost and easy to modify and customize. Tests were done on two small remotely piloted RC models. Offline parameter identification was done using a Hopfield neural network.


Acknowledgements

Thanks to Allah for all His gifts to me. I want to thank my supervisors for guiding me through the work. I would like to thank my parents for their support throughout my life, and my wife for her support. I would like to thank all ASTL people for their help throughout my thesis work, especially Eng. Osama Saied, Eng. Mohaned Draz, and Eng. Mustafa Moharm. Thanks to all the students I worked with through their graduation projects, for their accumulated work that increased the knowledge and transferred it through the years. I would also like to thank all my undergraduate professors in the Aerospace Engineering Department, Cairo University.


Dedication

To my family ... To my grandfather, Haj Ahmad Khattab


Contents

Acknowledgements i

Dedication ii

List of Figures v

List of Tables vii

List of Symbols and Abbreviations viii

Abstract 1

1 Introduction 2

2 Literature Review 5
  2.1 Parameter Identification Techniques 5
  2.2 Work Achieved in the Aerospace Department 8

3 Parameter Identification 11
  3.1 Introduction 11
  3.2 Artificial Neural Network 11
    3.2.1 Introduction 11
    3.2.2 Hopfield Neural Network 12
      3.2.2.1 Introduction 12
      3.2.2.2 Parameter Estimation Problem 14
    3.2.3 Results 16

4 Data Gathering Module 22
  4.1 Introduction 22
  4.2 Hardware 22
    4.2.1 Microcontroller 24
    4.2.2 Microcontroller Shield 24
    4.2.3 Servo Motors 25
    4.2.4 Servo and Potentiometer Sets 26
    4.2.5 Potentiometers 26
    4.2.6 IMU 26
    4.2.7 Pressure Sensor 27
    4.2.8 Velocity Sensor 27
    4.2.9 Wireless Module 27
    4.2.10 GPS Module 27
    4.2.11 Alpha-Beta Sensor 28
    4.2.12 Batteries 31
    4.2.13 RC Transmitter and Receiver 31
  4.3 Software 31
    4.3.1 Arduino Mega Software 31
    4.3.2 Monitoring Software 32
    4.3.3 Parameter Identification Software 33

5 Test Flights and Results 35
  5.1 KADET LT-40 RC Model Test Flight 35
    5.1.1 Introduction 35
    5.1.2 PI of Longitudinal Dynamics 36
    5.1.3 PI of Lateral Dynamics 40
    5.1.4 Test Summary and Recommendation 42
  5.2 Low Super Trainer RC Model 43
    5.2.1 Modifications for Test 2 43
    5.2.2 Flight Test 48
    5.2.3 Longitudinal Dynamics Identification 48
    5.2.4 Lateral Dynamics Identification 50

6 Conclusion and Future Work 53
  6.1 Suggested Future Work 53
    6.1.1 Parameter Identification 53
      6.1.1.1 Hardware 53
      6.1.1.2 Software 53
    6.1.2 Autopilot 53

References 55

Appendix A Hopfield neural network MATLAB code 56

Appendix B ArduImu V3 Codes 66

Appendix C Arduino Mega code 69

Appendix D Simulink GCS 80


List of Figures

1.1 Autopilot sequences 2

2.1 Simplified block diagram of the estimation procedure [6] 5
2.2 A sample of the used method for PI 10

3.1 An artificial neuron as used in a Hopfield network 13
3.2 An artificial neuron as used in a Hopfield network with an added sigmoid function 13
3.3 Example of discrete image recognition using a Hopfield neural network 14
3.4 Flow chart for Hopfield neural network PI 16
3.5 Real, measured, and predicted data for the states and state rates 18
3.6 Real, measured, and predicted data for a sample state 19
3.7 True and estimated parameters 20
3.8 Error in parameters 21

4.1 Block diagram of the whole system's processes 23
4.2 Data gathering module hardware 24
4.3 Arduino Mega 2560 microcontroller 24
4.4 Customized Arduino shield 25
4.5 Servo motor 25
4.6 Servo set 26
4.7 Potentiometer 26
4.8 ArduIMU V3 26
4.9 BMP085 (GY-65 module) pressure sensor 27
4.10 Velocity sensor 27
4.11 XBee Pro S2B wireless module 27
4.12 GPS module 28
4.13 First design (not used because of flow from the motor propeller) 28
4.14 Air data fin drawn in SolidWorks 29
4.15 CNC manufacturing of the air data fin 29
4.16 Air data fin for sideslip angle 30
4.17 Air data fin for angle of attack 30
4.18 The complete AOA and sideslip sensors attached to the wing tip 31
4.19 Flow chart of the Arduino software 32
4.20 Monitoring software 32
4.21 Simulink GCS 33
4.22 Flow chart of the Simulink GCS 33
4.23 Flow chart for Hopfield neural network PI 34

5.1 The flight path 35
5.2 Ready to fly 36
5.3 Captured photo from the flight test 36
5.4 Longitudinal states (measured and estimated); the yellow line is the elevator signal 38
5.5 Longitudinal states (measured and estimated) when using elevator and change in elevator as inputs; the yellow line is the elevator signal 39
5.6 Estimated pole map of the longitudinal dynamics 40
5.7 Lateral states (measured and estimated); the last line is the aileron signal 41
5.8 Estimated pole map of the lateral dynamics 42
5.9 Low Super Trainer RC model 43
5.10 Replacing the engine with an electric motor 44
5.11 The fin of the old sensor and the smaller one of the new sensor 45
5.12 The black potentiometer and the golden encoder 45
5.13 The modified air data system 46
5.14 The air data system setup for both aircraft 46
5.15 The air data systems of the two aircraft 47
5.16 The two sets of the new air data system 47
5.17 Data from the two velocity sensors during the flight 48
5.18 Data from the two beta sensors during the flight 48
5.19 Doublet elevator input 49
5.20 Measured and estimated longitudinal states 50
5.21 Longitudinal pole map 50
5.22 Lateral states; the last line is the aileron signal 51
5.23 Lateral pole map 52

6.1 Suggested work 54


List of Tables

3.1 Estimated elements of the A and B matrices 17
3.2 Results of different combinations of data sampling frequency and input signal type 17

5.1 KADET LT-40 RC model specifications 35
5.2 Low Super Trainer RC model specifications 44


List of Symbols and Abbreviations

A Dynamics Matrix in State space model

B Control Matrix in State space model

u input vector for the state space model

x States of the dynamical system

r Aircraft angular velocity, z body axis (yaw rate), deg/sec

t Time, sec

u Velocity along body fixed x-axis, m/sec

v Velocity along body fixed y-axis, m/sec

w Velocity along body fixed z-axis, m/sec

B Bias Vector

W Weighting matrix

E Error

α Angle of attack, deg

β Angle of sideslip, deg

δ Control surface deflection, deg

φ Roll Euler angle, deg

θ Pitch Euler angle, deg

ψ Yaw Euler angle, deg

f sigmoid function

ax Acceleration in x-direction, m/sec²

az Acceleration in z-direction, m/sec²

g Acceleration due to gravity, m/sec²

k Discrete time index

m Aircraft mass, kg

p Aircraft angular velocity, x body axis (roll rate), deg/sec

q Aircraft angular velocity, y body axis (pitch rate), deg/sec


ANN Artificial Neural Network

AOA Angle of attack

ASCD Aerodynamic stability and control derivatives

ASTL Aeronautical and Space Technology Laboratory, Cairo University

DGM Data gathering Module

GCS Ground control station

HNN Hopfield Neural Network

IMU Inertial measurement unit

NN Neural Network

PCB Printed circuit board

PI Parameter Identification

RNN Recurrent Neural Network

SNR Signal-to-noise ratio


Abstract

The first step in the autopilot design phase is the estimation of the aerodynamic stability and control derivatives. Parameter identification (PI) from flight data is one of the main accurate methods for finding aerodynamic stability and control derivatives, especially for small and unconventional aircraft.

In this work, we introduce an off-line PI system (software and hardware) using pre-collected data from a remotely piloted RC model. The thesis is divided into two main parts: 1) the flight segment, including data collection sensors and telemetry; 2) the ground segment, including telemetry and parameter identification software.

A KADET LT-40 RC model aircraft was first used as a test case; the aerodynamic stability and control derivatives estimated by our proposed system led to an estimated response that matched the measured one well, with an error of about 7%. Using another RC model, the Low Super Trainer, and after some modifications to the data gathering system and sensors, better results were obtained: the error was less than 1.5% for the longitudinal dynamics and 4% for the lateral dynamics.


Chapter 1: Introduction

In conventional autopilot design, parameter identification (PI) is an essential phase of aircraft mathematical modeling. By means of real flight tests, data on the input and output signals of the aircraft can be collected for a post-processing task that identifies the aerodynamic stability and control derivatives of the aircraft.

The linear state-space model of the aircraft consists of two matrices: the dynamics matrix and the control matrix. By identifying these two matrices, it is possible to build a linear mathematical model of the aircraft. Data are collected using a customized data gathering module designed for this purpose. The module consists of calibrated sensors and a microcontroller that collect and send data to the ground control station, where they are saved for PI in the post-processing phase. The PI is done using a Hopfield neural network (HNN) with 24 neurons for the longitudinal dynamics and another 24 neurons for the lateral dynamics.
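The neuron count follows directly from the number of unknown state-space entries. As a minimal sketch, assuming a standard four-state longitudinal model with two control inputs (an assumption for illustration; the states are not enumerated at this point in the text):

```python
import numpy as np

# Hypothetical longitudinal layout: 4 states (e.g. u, w, q, theta) and
# 2 inputs (e.g. elevator, throttle) -- assumed sizes, not stated here.
n_states, n_inputs = 4, 2
A = np.zeros((n_states, n_states))  # dynamics matrix: 16 unknown entries
B = np.zeros((n_states, n_inputs))  # control matrix: 8 unknown entries

# One HNN neuron per unknown parameter of the state-space model
n_neurons = A.size + B.size
print(n_neurons)  # 24, matching the neuron count quoted above
```

The same counting applies to the lateral model, which accounts for the second set of 24 neurons.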

In the following section, the main autopilot design steps are introduced to show where the thesis work is located in the overall process. Starting from the lower part of Figure 1.1, the main steps of autopilot design can be traced.

Figure 1.1: Autopilot Sequences

Starting from uncalibrated sensors, adding a data acquisition tool for filtering and calibration yields a sensor module. Connecting the sensor module to a microcontroller and saving the data for post-processing forms the data gathering module (DGM). The next step is to use these readings for parameter identification.

By PI it is possible to build the appropriate mathematical model of the system. By adding control algorithms such as altitude hold and heading autopilot, it is possible, in the stability control phase, to convert an unstable aircraft into a stable one, or to make a stable aircraft more stable.

In the stability control phase, the required change in the aircraft states is essentially zero: altitude is held at a certain value, speed at a constant cruise speed, and the Euler angles at zero or small values to maintain altitude and heading. At this point the aircraft can "draw a point": it will not fall, it will not stall, and it will sustain itself in the sky and reject disturbances, like a standing baby who cannot yet move.

Next comes the guidance phase, which "moves the baby": the aircraft is required to move from one point to another, from one longitude and latitude to another. By adding mission requirements to the stability control, we reach the final point in autopilot design. The ground control station (GCS) is modified and becomes more complicated from one step to the next.

In this thesis, we reach only the PI step, in its offline form. The work required building a data gathering module (DGM) containing calibrated sensors connected to a microcontroller to sense and save all aircraft data during flight, i.e., the input and output signals, for example the elevator signal and the pitch angle.

These data can be used for various applications: one may use them for PI, another for an autopilot circuit. In the present work, test flights were made on remotely piloted RC models, and a good flight sample was chosen for the PI phase.

The tests were done on two RC models: the KADET LT-40 and the Low Super Trainer. The KADET flew the first tests, and modifications to the whole system were suggested and taken into consideration in the Low Super Trainer test flights. The results showed that the approach was successful, with very appealing results, and opened the way for future work on online parameter identification and on stability control autopilot design based on offline parameter identification results.

In chapter 3, the Hopfield neural network for PI is introduced, describing the PI problem, the branch of neural networks where the Hopfield network is located, and how to use the HNN for PI. The main proof of the method was introduced by Raol and others (see [6]), but for a two-state system with a 2x2 A matrix and a 2x1 B matrix. A general code for any nxn A and nxm B matrices is introduced, where n is the number of states and m the number of inputs, along with a general flow chart of the method. To validate the method, a system with a known state-space model is used: an input signal is applied to the system to get its state responses, noise signals are added to the states and the input, and the noisy signals are fed to the HNN code to recover the original A and B matrices. A comparison between the original and estimated systems shows appealing results, justifying further use with real flight data.
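The validation experiment can be sketched end-to-end as follows. This is a minimal Python sketch (the thesis implementation is the MATLAB code of Appendix A); the 2x2 A and 2x1 B matrices, noise levels, and input signal are illustrative choices, not the thesis's values, and the sigmoid is dropped so the Hopfield update reduces to a linear gradient flow on the equation-error cost, with the weight matrix W pre-computed from signal correlations in the spirit of [6]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Known system to be recovered (illustrative values, not from the thesis)
A_true = np.array([[-1.0,  0.5],
                   [-2.0, -3.0]])
B_true = np.array([[0.0],
                   [1.0]])

# Simulate the true system under a random multistep input (Euler integration)
dt, N = 0.01, 2000
u = np.repeat(rng.choice([-1.0, 1.0], size=N // 50), 50)[:, None]
x = np.zeros((N + 1, 2))
for k in range(N):
    x[k + 1] = x[k] + dt * (A_true @ x[k] + B_true @ u[k])

# Noisy "measurements" of the states and state rates, as in the validation test
xm = x[:N] + 1e-3 * rng.standard_normal((N, 2))
xdm = np.diff(x, axis=0) / dt + 1e-3 * rng.standard_normal((N, 2))

# HNN weights pre-computed from correlations of the regressor z = [x; u]
z = np.hstack([xm, u])
W = z.T @ z            # correlation matrix of the regressors
C = z.T @ xdm          # cross-correlation with the measured state rates

# Hopfield-style gradient flow on the equation-error cost:
# the network states Theta converge to the stacked [A; B] parameters
Theta = np.zeros((3, 2))
eta = 1.0 / np.linalg.eigvalsh(W).max()   # step size for stable convergence
for _ in range(20000):
    Theta += eta * (C - W @ Theta)

A_est, B_est = Theta[:2].T, Theta[2:].T
```

Running the flow on the noisy signals recovers A and B to a small error, mirroring the comparison presented in Chapter 3.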

In Chapter 4, the data gathering module (DGM) is introduced. The various components of the system are shown, describing the specifications and limitations of each sensor. Then a full flow chart of the system is given, showing the onboard components and the ground control station (GCS) components. The various software used is introduced, including the onboard software, the GCS software, and the PI software. The fully detailed software codes are given in the appendices.

In chapter 5, the flight tests are introduced, showing the specifications of each aircraft used, then the flight data and cautions. The results of the first flight are presented and discussed to derive the recommendations and cautions taken into consideration in the second test flight. The recommended modifications are carried out and described in this chapter. The second test flight shows better performance and results.

In chapter 6, conclusions and recommendations for future work are presented and discussed.


Chapter 2: Literature Review

2.1 Parameter Identification Techniques

Parameter identification is a wide research subject; applied to aircraft systems, it allows us to obtain a mathematical model of an aircraft from real flight data. These data can yield the aerodynamic properties of an aircraft whether or not the aircraft is at its design flight condition, whether the flight was stable or unstable, and even when the geometry of the aircraft has changed through a crash or added external payloads or weights. Wind tunnel experiments can be validated with real flight data, and real flights are more reliable and accurate than restricted wind tunnels. Flight simulation for pilot training can be enhanced when the simulation is close to real flight behavior, and this can be guaranteed with the results of parameter identification techniques.

The problem of parameter estimation is based on the minimization of some criterion of the estimation error, and this criterion itself can serve as one of the means to establish the adequacy of the identified model. Figure 2.1 shows the simple approach to parameter identification.

Figure 2.1: Simplified block diagram of the estimation procedure [6]

Given the measurements z of a system, which are the real state response to an input but with added noise: the input signal is applied to the aircraft and the states are measured, taking into consideration that the measured state responses may carry added noise. On the other hand, an initial model of the system is assumed, and the input signal is applied to it to get the model response, which will not equal the measured one. An optimization criterion is therefore introduced, and iteration proceeds until an accepted output error is reached. The optimization criterion is the main difference between the various PI methods. We have to take into account that the least output error is not a sufficient condition for achieving a good estimate, so expanded versions are introduced to obtain sufficient criteria.
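The estimation loop of Figure 2.1 revolves around a scalar criterion that the optimizer drives down. A minimal sketch of an output-error cost for a hypothetical two-state, one-input linear model (the sizes and the Euler propagation are illustrative assumptions, not the formulation of [6]):

```python
import numpy as np

def output_error_cost(theta, u_seq, z_meas, x0, dt):
    """Output-error criterion for a candidate 2-state, 1-input model.

    theta packs the entries of A (4 values) then B (2 values); the sizes
    are illustrative only. The candidate model is simulated from x0 and
    its response is compared with the measured response z_meas.
    """
    A = theta[:4].reshape(2, 2)
    B = theta[4:].reshape(2, 1)
    x = np.array(x0, dtype=float)
    cost = 0.0
    for k in range(len(u_seq)):
        cost += np.sum((z_meas[k] - x) ** 2)                 # accumulated output error
        x = x + dt * (A @ x + B @ np.atleast_1d(u_seq[k]))   # Euler step
    return cost
```

An optimizer iterates on theta to reduce this cost; the choice of criterion, and how it is minimized, is essentially what distinguishes the PI methods discussed below.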


Many methods have been used for the PI problem. One of them is the maximum likelihood output error method. This method has a batch iterative procedure: in one shot, all the measurements are used and parameter corrections are obtained; new parameters are then calculated, iteration proceeds, and so on.

The output error method has two limitations:
i) it can handle only measurement noise, not process noise;
ii) divergence may occur for unstable systems.

Using a Kalman filter can help handle the process noise. This leads to a new method called the filter error method. In this method, the output states are filtered before applying the cost function, which makes the method more complex and computationally intensive. The filter error method can compete with the extended Kalman filter, which can handle process as well as measurement noise and can also estimate parameters as additional states. One major advantage of the Kalman filter/extended Kalman filter is that it is a recursive technique, very suitable for on-line real-time applications; for the latter, a factorization filter might be very promising. One major drawback of the Kalman filter is filter tuning, for which adaptive approaches need to be used.

The second limitation of the output error method, for unstable systems, can be overcome by using the so-called stabilized output error methods, which use measured states; this stabilizes the estimation process. Alternatively, the extended Kalman filter or the extended factorization filter can be used, since they have some implicit stability property in the filtering equation. The filter error method can be efficiently used for unstable/augmented systems.

Since the output error method is an iterative process, all the predicted measurements are available and the measurement covariance matrix R can be computed in each iteration. The extended Kalman filter for parameter estimation could pose some problems, since the covariance matrix parts for the states and the parameters would be of quite different magnitudes. Another major limitation of the Kalman filter type approach is that it cannot determine the model error, although it can produce good state estimates; the latter is achieved by process noise tuning. This limitation can be overcome by using the model error estimation method.

This approach provides an estimate of the model error, i.e., the model discrepancy with respect to time. However, it cannot handle process noise. In this sense, model error estimation can compete with the output error method and, additionally, can be a recursive method. However, it requires tuning, like the Kalman filter. The model discrepancy needs to be fitted with another model, the parameters of which can be estimated using the recursive least squares method.

Another approach, which parallels model error estimation, is the estimation before modeling approach. It has two steps:
i) an extended Kalman filter to estimate the states (and scale factors and bias-related parameters);
ii) a regression method to estimate the parameters of the state model or a related model.


Model error estimation also has two steps:
i) state estimation and discrepancy estimation using the invariant embedding method;
ii) a regression method to estimate the parameters from the discrepancy time history.
Both estimation before modeling and model error estimation can be used for parameter estimation of a nonlinear system. The output error method and the filter error method can also be used for nonlinear problems.

The feedforward neural network based approach somewhat parallels the two-step methodologies, but it is quite distinct from them: it first predicts the measurements, and then the trained network is used repeatedly to obtain differential states/measurements. The parameters are determined by the delta method and averaging.

The recurrent neural network based approach looks quite distinct from many approaches, but a closer look reveals that the equation error method and the output error method formulations can be solved using recurrent neural network based structures. In fact, the equation error method and the output error method can be so formulated without invoking recurrent neural network theory and will still look as if they were based on certain variants of recurrent neural networks. This revealing observation is important for practical application of recurrent neural networks to parameter estimation, especially for on-line/real-time implementation using adaptive circuits, VLSI, etc. Of course, one needs to address the convergence of the recurrent neural network solutions to the true parameters. Interestingly, the parameter estimation procedure using a recurrent neural network differs from that based on a feedforward neural network. In the recurrent neural network, the so-called weights (the weighting matrix W) are pre-computed using correlation-like expressions between x, ẋ, u, etc. Integrating a certain expression, which depends on the sigmoid nonlinearity, the weight matrix, the bias vector, and some initial guess of the states of the recurrent neural network, yields the new states of the network; these states are the estimated parameters (of the intended state-space model). This quite contrasts with the estimation procedure using a feedforward neural network, where the weights of the network are not the parameters of direct interest. In the recurrent neural network, too, the weights are not of direct interest, although they are pre-computed and not updated as in feedforward networks. In both methods, we do not learn much about the statistical properties of the estimates and their errors; further theoretical work is needed in this direction.

The genetic algorithms provide yet another alternative method that is based on directcost function minimization and not on the gradient of the cost function. This is veryuseful for types of problems where the gradient could be ill-defined. However, the ge-netic algorithms need several iterations for convergence and stopping rules are needed.One limitation is that we cannot get parameter uncertainties, since they are related tosecond order gradients. In that case, some mixed approach can be used, i.e., after theconvergence, the second order gradients can be evaluated. Parameter estimation workusing the artificial neural networks and the genetic algorithms is in an evolving state. Newresults on convergence, uniqueness, robustness and parameter error-covariance need to beexplored. Perhaps, such results could be obtained by using the existing analytical resultsof estimation and statistical theories. Theoretical limit theorems are needed to obtain more


confidence in these approaches. Parameter estimation for inherently unstable/augmented systems can be handled with several methods, but certain precautions are needed: the existing methods need certain modifications or extensions, whose ramifications are straightforward to appreciate. On-line/real-time approaches are interesting extensions of some of the off-line methods. Useful approaches are:
i) the factorisation-Kalman filtering algorithm;
ii) the recurrent neural network;
iii) frequency domain methods.

2.2 Work achieved in the Aerospace Department

Over the last four years, several approaches to autopilot design were attempted. In 2011, work started on acquiring raw sensor data to be filtered and calibrated. The candidate sensors ranged from very cheap, low-accuracy units to very expensive ones. Both extremes were rejected: the expensive sensors for budget reasons and the cheap ones for their low accuracy. Later in the same year another approach was chosen: use a complete commercial autopilot and upload previously calculated custom control gains for a flying-wing UAV. The MicroPilot autopilot was chosen for the task, but it was not as easy to work with as open-source autopilots such as ArduPilot. The unit had a problem in its wireless module, and the warranty period had expired before the project started. The autopilot stopped working after a while, and customer service told us we would have to ship it back for repair and return, all at our own expense. In 2012, a trial with ArduPilot showed that it is easier and far cheaper than MicroPilot, with appealing results for student projects. However, with the ArduPilot board, as with the MicroPilot board, it was not easy to replace a faulty part or to add custom sensors. In 2013 we began to purchase the ArduPilot's inner components separately, i.e. the microcontroller and sensors, and work started on building our own autopilot circuit based on the Arduino Mega microcontroller and the ArduIMU sensor board. In 2014, a first data gathering module (DGM) was built to be used for parameter identification (PI). In the same year a stability-control autopilot was built on top of this DGM, and a displacement autopilot was completed successfully. In 2015 we performed offline PI for both the longitudinal and lateral aircraft dynamics using the customized DGM with a ground control station (GCS) based on Simulink.

For the control algorithms: in 2011 we built linear and nonlinear simulations of the aircraft using Matlab and Simulink, then designed a linear PID controller for the altitude-hold autopilot. The gains were verified on the nonlinear simulation and iterated until a better nonlinear altitude-hold response was obtained. Parameter identification was done with a Matlab code based on Roskam's book on aerodynamic stability and control derivatives. In 2013, another project generalized the code to accept the dimensions of any conventional aircraft. The code was used to design a PID controller for altitude hold of the Bixler RC model. The longitudinal gains came very close to the documented Bixler gains used in the ArduPilot autopilot, but the lateral gains did not. So the Roskam-calculated parameters were successful for the longitudinal dynamics but questionable for the lateral ones. An aircraft's aerodynamic stability and control derivatives (ASCD) change from one flight condition to another. The ASCD also change if any part of the aircraft is damaged or modified, and a new aircraft needs its ASCD defined through real experiments. Deriving the dynamic equations of an aircraft from the equations of motion can be difficult work and also needs verification. All these difficulties are addressed by real flight tests designed


to perform PI for aircraft. A lot of work has been done on PI based on real flights, using either offline or online PI. A survey of offline PI found that the output error method is the most widely used. Studies show that using a neural network (NN) for offline PI is more accurate than the output error method (see J.R. Raol, G. Girija and J. Singh, 2004) [6]. In online PI, neural networks are also a very popular technique. So a better track for long-term PI work is to use a neural network offline first and then modify it for online use. In this study we are therefore mainly interested in offline PI using an NN.


Figure 2.2: A sample of the used method for PI


Chapter 3: Parameter Identification

3.1 Introduction

In our field of study, parameter identification techniques are used to estimate the aerodynamic and stability derivatives of an aircraft. Given these derivatives, we can build a model that describes the aircraft's dynamics. Many methods have been developed for this job. Roskam built a good method to obtain the parameters via experimental tests, though it is better suited to large aircraft and depends on the calculated geometry and properties of the aircraft. Parameter identification (PI) using real flight data is the most reliable technique to measure the aircraft parameters: no mathematical model of the aircraft is needed, since you work with the real vehicle. Changes in aircraft geometry or flight condition during flight are also not a problem, since you deal with the real vehicle, especially for on-line PI. There are two techniques of PI using real flight data: on-line and off-line. On-line estimation algorithms update the parameters of a model as new data become available during the operation of the model. In contrast, if you first collect all the input/output data and then estimate the model parameters, you perform off-line estimation. Parameter values estimated on-line can vary with time, but parameters estimated off-line do not. In this way, off-line training is not restricted by the on-board computer's computation power, and the approximation can be highly accurate. Traditional methods include recursive least squares and the Kalman filter for parameter estimation (Ljung, 1999). Newer alternative methods include Hopfield Neural Networks (HNNs) for parameter estimation (see, for example, Atencia, Joya, & Sandoval, 2004; Hu & Balakrishnan, 2005). Neural networks are used mainly for on-line PI. Off-line learning cannot take advantage of a neural network's ability to adapt to the changing aircraft dynamics during flight.
The on-line learning method uses the real-time flight data obtained for training and has much higher requirements for on-board hardware and software implementation. However, this method can learn the changing aircraft dynamics and give the controller some level of "intelligence". The learning speed and accuracy of on-line NNs are limited by the natural frequency of the system, the on-board computation power and the available resources. One benefit of this study is to train an NN for off-line PI, so that moving to on-line parameter estimation later becomes easier. The on-board computer is also restricted to the lowest possible cost. Studies show that using an NN for off-line PI is more accurate than the output error method, the most widely used technique for off-line PI (see J.R. Raol, G. Girija and J. Singh, 2004) [6]. So in this study we are mainly interested in studying off-line PI using an NN.
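The contrast between the two modes can be made concrete with a toy linear-regression problem: batch (off-line) least squares, which uses all data at once, versus recursive (on-line) least squares, which updates the estimate one sample at a time. The regression model and all numeric values below are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-regression problem y = phi^T theta + noise (illustrative values).
theta_true = np.array([2.0, -1.0, 0.5])
Phi = rng.standard_normal((500, 3))
y = Phi @ theta_true + 0.01 * rng.standard_normal(500)

# Off-line (batch) estimation: all input/output data collected first,
# then solved in one shot.
theta_batch, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# On-line recursive least squares: the estimate is refined sample by
# sample, so it can track parameters that change during operation.
theta_rls = np.zeros(3)
P = 1e3 * np.eye(3)                    # large initial covariance (weak prior)
for phi, yk in zip(Phi, y):
    K = P @ phi / (1.0 + phi @ P @ phi)          # gain for this sample
    theta_rls = theta_rls + K * (yk - phi @ theta_rls)
    P = P - np.outer(K, phi @ P)                 # covariance update
```

With stationary parameters both estimates agree; the on-line form simply pays for its adaptability with per-sample computation on board.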

3.2 Artificial neural network

3.2.1 Introduction

An artificial neural network (ANN) is an attempt to build a mathematical model of the human brain. The human brain consists of a very large number of neurons. The artificial neuron receives one or more inputs (representing the dendrites) and sums them to produce an output (representing a biological neuron's axon). Usually the inputs of each


node are weighted, and the sum is passed through a non-linear function known as an activation function. The activation functions usually have a sigmoid shape. Modeling a system using artificial neural networks has recently become popular, with applications to signal processing, pattern recognition, system identification and control. Estimation of parameters using empirical data plays a crucial role in modeling and identification of dynamic systems. Often equation error and output error methods are used for parameter estimation of dynamic systems. These are generally batch iterative procedures, in which a set of data is processed to compute the gradient of a cost function and the estimation error; the parameter estimates are then refined iteratively based on the improved estimates of the error and its gradients. The artificial neural networks provide new/alternative paradigms for handling the parameter estimation problem, with potential application to on-line estimation. Recurrent neural networks in particular are easily amenable to such possibilities due to their special structure: feed-forward neural networks with a feedback feature. To obtain fast solutions, a system of parallel computers can be used, which would require the parallelization of the conventional parameter estimation algorithms. Since artificial neural networks have massively parallel processing capacity, they can be adapted to parameter estimation problems for on-line applications. In particular, recurrent neural networks can be considered more suitable than feed-forward neural networks for parameter estimation of linear dynamical systems. Recurrent neural networks are dynamic neural networks, and hence amenable to explicit parameter estimation in state-space models. One of the architectures of the RNN is the Hopfield neural network (HNN). In the following section the HNN is introduced for use in PI.

3.2.2 Hopfield neural network

3.2.2.1 Introduction

One of the milestones of the current renaissance in the field of neural networks was the associative model proposed by Hopfield at the beginning of the 1980s. Hopfield's approach illustrates the way theoretical physicists like to think about ensembles of computing units: no synchronization is required, and each unit behaves as a kind of elementary system in complex interaction with the rest of the ensemble. The Hopfield neural network (HNN) can be used successfully for the PI problem. First we introduce the HNN as a general neural network. As an analogy from human neural processing, consider a man recognizing photos with both eyes. If one eye is weak, the brain relies on the better eye more than the other one, and the neural connections from the weaker eye become weaker still. It is as if the brain assigns a higher weighting to the healthy eye and a lower weighting to the sick one; doctors advise treating the sick eye precisely because the brain will otherwise neglect it more and more over time. These weightings are what we try to capture when describing the process mathematically. A simple way to show this is in figure 3.1. To get the right output, we must first find the right values of the weights; this is called learning. After obtaining the appropriate weights, applying the input signals yields the output; this is called validation. Assume a discrete problem where it is required to recognize the C character written on a 4×3 pixel grid as shown in figure 3.3a. When the C character is written as in figure 3.3b, the solid pixels are represented as 1 and the empty ones as -1, as in


Figure 3.1: An artificial neuron as used in a Hopfield network

figure 3.3c. Using the Hopfield neural network, weights are obtained between each pixel and every other pixel; a sample of the weights is shown in figure 3.3d. In the language of neural networks the pixels are now neurons, where:

w_{ij} = x_i \, x_j    (3.1)

where i and j are neuron indices and x is the value of the neuron, 1 or -1. By obtaining these weights, the learning step has ended. In validation, someone is asked to write the C character on the pixels, as seen in figure 3.3e. What has been written is transferred into 1s and -1s as in figure 3.3f. The neural network then keeps changing the pixel values (1 and -1) until it reaches a learned character, which here is only the C character. To update a neuron value, Hopfield uses the following rule:

x_i = \mathrm{sign}\left(\sum_{j=1}^{n} \omega_{ij} x_j\right)    (3.2)

It is proven that this update rule never diverges [6]. The values of the neurons move towards the nearest learned shape, here the C character; so the state either reaches the C character or at least does not get worse. The sign appearing in equation 3.2 corresponds to a block added to the HNN of figure 3.1, after the rectangular summation block. In general, for a continuous system, this block is not a sign but a sigmoid function, of which the sign is a special case, as shown in figure 3.2.
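Equations 3.1 and 3.2 can be exercised end to end on the C-character example; the sketch below uses a plausible 4×3 encoding of the "C" (the thesis does not list the exact pixel values, so the pattern and the corrupted pixels are illustrative).

```python
import numpy as np

# The "C" character on a 4x3 grid, solid pixel = +1, empty = -1
# (an illustrative encoding; the thesis does not give the exact values).
C = np.array([[ 1,  1,  1],
              [ 1, -1, -1],
              [ 1, -1, -1],
              [ 1,  1,  1]]).flatten()

# Learning (eq. 3.1): one weight per pixel pair, w_ij = x_i * x_j.
W = np.outer(C, C)

# Validation: corrupt three pixels, then apply the update rule (eq. 3.2),
# x_i = sign(sum_j w_ij x_j), synchronously over all neurons.
noisy = C.copy()
noisy[[1, 5, 9]] *= -1
recalled = np.sign(W @ noisy)
```

With a single stored pattern, one synchronous update recovers the character whenever the corrupted input still overlaps the pattern more than its negation.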

Figure 3.2: An artificial neuron as used in a Hopfield network with added sigmoid function


This was a rough description of what happens in the Hopfield neural network.

Figure 3.3: Example of discrete image recognition using a Hopfield neural network. (a) 4×3 pixel grid; (b) the C character; (c) its pixel values; (d) sample of the weights; (e) a written character; (f) its pixel values

3.2.2.2 Parameter Estimation Problem

The proof was given in Raol et al. [6], and the main theory was revised from the original paper [7]. Consider the state-space representation of a dynamical system

\dot{x} = Ax + Bu    (3.3)

The aim is to solve for the system parameter matrices A and B knowing \dot{x}, x and u. Here x is the (n×1) state vector, u = u(t) is the control input (scalar), A = [a_{ij}] is an n×n matrix and B = [b_i] is an n×1 vector. The parameters a_{ij} (elements of A) and b_i (elements of B) to be estimated are collected in the n_p×1 vector β, where:

\beta = [a_{11}, a_{12}, \ldots, a_{1n}, a_{21}, a_{22}, \ldots, a_{2n}, \ldots, a_{n1}, a_{n2}, \ldots, a_{nn}, b_1, b_2, \ldots, b_n]    (3.4)

where n_p is the number of parameters to be estimated, i.e. n^2 + n. Next we describe the dynamics of the HNN as

\dot{\upsilon}_i = \sum_{j=1}^{n} \omega_{ij}\beta_j + b_i    (3.5)

Our problem is to define the weighting matrix W and the bias vector b, choose the sigmoid function f and its coefficients, and then use this information to get the rate of change of the parameters, \dot{\beta}, and finally the new parameter values as \beta_{new} = \beta_{old} + \dot{\beta}\,\delta t.

Given the equation error e(k), where e = \dot{x} - Ax - Bu, we can define the cost function as:

E(\beta) = \frac{1}{2}\sum_{k=1}^{n} e^T(k)\,e(k) = \frac{1}{2}\sum_{k=1}^{n} (\dot{x} - Ax - Bu)^T(\dot{x} - Ax - Bu)    (3.6)


Also we have

E = -\frac{1}{2}\sum_{i}\sum_{j}\omega_{ij}\beta_i\beta_j - \sum_{i} b_i\beta_i    (3.7)

as the energy landscape of the recurrent neural network. Now, from optimization theory, differentiating the cost function with respect to β gives

\frac{\partial\beta}{\partial t} = -\frac{\partial E}{\partial\beta} = -\frac{1}{2}\,\frac{\partial}{\partial\beta}\left(\sum_{k=1}^{n} e^T(k)\,e(k)\right)    (3.8)

Since the parameter vector β contains the elements of A and B, we can obtain the expressions \partial E/\partial A and \partial E/\partial B for the A matrix and the B vector.

\frac{\partial E}{\partial A} = \sum_{k=1}^{n} (\dot{x} - Ax - Bu)(-x)^T = A\sum xx^T + B\sum ux^T - \sum \dot{x}x^T    (3.9a)

\frac{\partial E}{\partial B} = \sum_{k=1}^{n} (\dot{x} - Ax - Bu)(-u) = A\sum xu + B\sum u^2 - \sum \dot{x}u    (3.9b)

By expanding the elements of the above matrices we obtain \partial E/\partial a_{ij} and \partial E/\partial b_i. For example,

\frac{\partial E}{\partial a_{11}} = a_{11}\sum x_1^2 + a_{12}\sum x_1 x_2 + \cdots + b_1\sum x_1 u - \sum \dot{x}_1 x_1    (3.10)

Again, by differentiating the energy equation with respect to β_i, we have

\frac{\partial E}{\partial \beta_i} = -\sum_{j=1}^{n}\omega_{ij}\beta_j - b_i    (3.11)

By comparing equations 3.10 and 3.11 we can find the elements of both the weighting matrix W and the bias vector b. Now, since \beta_i = f(\upsilon_i), we have \upsilon_i = f^{-1}(\beta_i), and therefore

\dot{\upsilon}_i = \frac{\partial f^{-1}(\beta_i)}{\partial \beta_i}\,\dot{\beta}_i    (3.12)

So we can define the rate of change of the parameters, \dot{\beta}, as

\dot{\beta}_i = \left(\frac{\partial f^{-1}(\beta_i)}{\partial \beta_i}\right)^{-1}\dot{\upsilon}_i = \left(\frac{\partial f^{-1}(\beta_i)}{\partial \beta_i}\right)^{-1}\left(\sum_{j=1}^{n}\omega_{ij}\beta_j + b_i\right)    (3.13)

By choosing a suitable sigmoid function we can then obtain \dot{\beta}. Assuming

f(\upsilon_i) = \rho\,\frac{1 - e^{-\lambda\upsilon_i}}{1 + e^{-\lambda\upsilon_i}}    (3.14)

then

\dot{\beta}_i = \frac{\lambda(\rho^2 - \beta_i^2)}{2\rho}\left(\sum_{j=1}^{n}\omega_{ij}\beta_j + b_i\right)    (3.15)
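The coefficient in equation 3.15 follows from differentiating the sigmoid of equation 3.14, which can be checked by rewriting it as a hyperbolic tangent:

```latex
f(\upsilon_i) = \rho\,\frac{1 - e^{-\lambda\upsilon_i}}{1 + e^{-\lambda\upsilon_i}}
             = \rho \tanh\!\left(\frac{\lambda\upsilon_i}{2}\right)
\;\Longrightarrow\;
\frac{d\beta_i}{d\upsilon_i}
  = \frac{\lambda\rho}{2}\left(1 - \tanh^{2}\!\frac{\lambda\upsilon_i}{2}\right)
  = \frac{\lambda\rho}{2}\left(1 - \frac{\beta_i^{2}}{\rho^{2}}\right)
  = \frac{\lambda(\rho^{2} - \beta_i^{2})}{2\rho}
```

Substituting this derivative together with \dot{\upsilon}_i from equation 3.5 into \dot{\beta}_i = (d\beta_i/d\upsilon_i)\,\dot{\upsilon}_i recovers equation 3.15.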


Finally, to get the new parameter vector β we have

\beta_i(k+1) = \beta_i(k) + \dot{\beta}_i\,\delta t    (3.16)

\beta_i(k+1) = \beta_i(k) + \delta t\,\frac{\lambda(\rho^2 - \beta_i^2)}{2\rho}\left(\sum_{j=1}^{n}\omega_{ij}\beta_j + b_i\right)    (3.17)

Figure 3.4: Flow chart for Hopfield Neural Network PI

3.2.3 Results

A Matlab code was built based on the previous derivation. A set of data was generated from a given linear state-space model. The model has four states and one input signal, giving 20 parameters. The estimation was carried out using noise-free data and again with additive noise (SNR = 10). The tuning parameters λ and ρ were kept at 0.001 and 200, respectively, and the sampling time was set to 0.00025 s. The RNN took around 200,000 iterations before the estimated parameters converged to the true values for the noise-free signal, and about 900,000 iterations for the noisy signal. Table 3.1 shows the estimated parameters for the two data sets. The error shown is simply the difference between the true and estimated values.
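The thesis code is in Matlab (Appendix A); the sketch below reproduces the same HNN procedure in Python on a hypothetical two-state system, solving one Hopfield network per state equation, which is an equivalent block-wise arrangement of the W and b defined by equations 3.9-3.11. The system matrices, input signal and iteration count are illustrative only, not the thesis's four-state model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state, 1-input system (NOT the thesis's 4-state model):
# x_dot = A x + B u, giving 6 parameters to estimate.
A_true = np.array([[-1.0, 2.0], [0.0, -3.0]])
B_true = np.array([[1.0], [0.5]])

# Simulate to collect noise-free (x, x_dot, u) records.
dt_sim, N = 0.02, 400
x = np.zeros(2)
X, Xdot, U = [], [], []
for k in range(N):
    u = np.sin(0.5 * k * dt_sim) + 0.3 * rng.standard_normal()  # exciting input
    xdot = A_true @ x + B_true[:, 0] * u
    X.append(x.copy()); Xdot.append(xdot); U.append(u)
    x = x + dt_sim * xdot                        # Euler integration
X, Xdot, U = np.array(X), np.array(Xdot), np.array(U)

# Weights and biases from correlation sums (eqs. 3.9-3.11): for each state
# equation i, W = -sum z z^T and b = sum xdot_i z, with regressor z = [x; u].
Z = np.hstack([X, U[:, None]])                   # N x 3 regressor matrix
W = -(Z.T @ Z)
lam, rho, dt = 1e-3, 200.0, 0.01                 # sigmoid slope/limit, step
rows = []
for i in range(2):                               # one HNN per row of [A | B]
    bias = Z.T @ Xdot[:, i]
    beta = np.zeros(3)                           # HNN states = parameters
    for _ in range(40000):                       # eqs. 3.15 and 3.16
        beta_dot = lam * (rho**2 - beta**2) / (2 * rho) * (W @ beta + bias)
        beta = beta + dt * beta_dot
    rows.append(beta)
theta = np.array(rows)                           # estimated [A | B], row-wise
```

At the equilibrium of equation 3.15, W β + b = 0, which is exactly the least-squares normal equation, so with noise-free data the network settles on the true parameters.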


Table 3.1: Estimated elements of A and B matrices

True Value | Estimated (SNR = ∞, 200,000 iter.) | Error | Estimated (SNR = 10, 900,000 iter.) | Error
-0.045 | -0.045003671 | 3.6707E-06 | -0.027277008 | -0.017722992
-0.036 | -0.036021552 | 2.15516E-05 | 0.173624619 | -0.209624619
0 | 0.00038355 | -0.00038355 | -12.22945959 | 12.22945959
-30 | -30.00003794 | 3.79433E-05 | -26.46613445 | -3.533865551
-0.369 | -0.369084965 | 8.49649E-05 | -0.075466871 | -0.293533129
-2.02 | -2.020498759 | 0.000498759 | -0.184922364 | -1.835077636
1 | 1.008898544 | -0.008898544 | 0.567241605 | 0.432758395
0 | -0.000878168 | 0.000878168 | 2.034148622 | -2.034148622
0.0019 | 0.000911806 | 0.000988194 | -0.010721288 | 0.012621288
-0.0396 | -0.045400063 | 0.005800063 | 0.010295163 | -0.049895163
-20.948 | -20.84433291 | -0.103667093 | -16.91628511 | -4.031714892
0 | -0.010214947 | 0.010214947 | 0.375789166 | -0.375789166
0 | 0.000588044 | -0.000588044 | 0.002145217 | -0.002145217
0 | 0.003451917 | -0.003451917 | 0.018468908 | -0.018468908
1 | 0.938412706 | 0.061587294 | 0.963367259 | 0.036632741
0 | 0.006077832 | -0.006077832 | 0.040599949 | -0.040599949
0.56 | 0.560001413 | -1.41341E-06 | -0.969877568 | 1.529877568
-0.8 | -0.79996698 | -3.30197E-05 | -0.536048355 | -0.263951645
-0.5 | -0.499613434 | -0.000386566 | -0.478945969 | -0.021054031
-0.7 | -0.700228534 | 0.000228534 | -0.653822901 | -0.046177099
Overall error | | -0.04314389 | | 1.467580966

Estimated data for both data sets are shown in figures 3.5 to 3.8. Figure 3.5 shows the real and estimated states and state rates, figure 3.6 a sample state in detail, figure 3.7 the true A and B matrix elements together with the estimated parameters, and figure 3.8 the error in the parameters over the iterations.

A study [6] showed that a 3211 input signal excites both the long- and short-period modes of the aircraft. Also, a sampling time of 0.01 s (100 Hz) when collecting data gives better results in the PI problem [10]. In the following table, combinations of the data sampling frequency and the type of input signal are studied, showing their impact on the HNN PI problem.

Table 3.2: Results for different combinations of data sampling frequency and input-signal type

Input signal          | 100 Hz                            | 20 Hz                             | Comment
3211 input            | Error = 0.0359 %, Time = 63.485 s | Error = 0.2744 %, Time = 64.72 s  | Long period obtained; short period obtained
Pulse input for 2 s   | Error = 0.0199 %, Time = 64.94 s  | Error = 0.1912 %, Time = 65.214 s | Long period not obtained; short period obtained
Doublet input 0.5 Hz  | Error = 0.0362 %, Time = 66.538 s | Error = 0.3053 %, Time = 68.0966 s| Long period not obtained; short period obtained


Figure 3.5: Real, measured and predicted data for the states and state rates. (a) SNR = ∞; (b) SNR = 10


Figure 3.6: Real, measured and predicted data for a sample state. (a) SNR = ∞; (b) SNR = 10


Figure 3.7: True and estimated parameters. (a) SNR = ∞; (b) SNR = 10


Figure 3.8: Error in parameters. (a) SNR = ∞; (b) SNR = 10


Chapter 4: Data Gathering Module

4.1 Introduction

The autopilot is the brain of an aircraft: it is responsible for all commands given to the aircraft. It collects data, processes it and finally makes the decision. The whole autopilot system comprises the hardware, the software running on it and the ground control station (GCS). The autopilot hardware circuit contains all the elements required to properly sense the aircraft and its situation, such as altitude, attitude, velocity, the surrounding atmosphere and so on. The autopilot software transfers all the sensor data to the processor, analyses it, makes the decision and sends it to the aircraft actuators. For now we use the autopilot only as a data gathering module (DGM), so data is just collected and stored with no resulting control action. The following sections give a brief description of the hardware circuit components, their specifications and limits, then a description of the data gathering module software and the GCS software.

4.2 Hardware

The DGM designed and tested in the ASTL has a total weight of about 250 g and external dimensions of about 9×3×5 cm. The part-selection criterion was to get the best parts available in the local market; most parts were obtained from Ram and Future Electronics, see [2],[3]. The module consists of three PCBs: the main one carries the microcontroller and the sensors; the second is a servo-set PCB connecting the servos to the main PCB; and the last connects the analog inputs (potentiometers and velocity sensors) to the main board. The work done at West Virginia University was very helpful when choosing and testing sensors: Dr. Brad Seanor finished his PhD in 2002 [10] and then co-supervised Dr. Yu Gu's PhD in 2004 [5] and Eng. Amanda K. McGrail's MSc in 2012 [8].


Figure 4.1: Block Diagram for the whole system processes


Figure 4.2: Data gathering Module Hardware

4.2.1 Microcontroller

As shown in the figure, the microcontroller used was the Arduino Mega 2560 (16 MHz, 256 KB flash memory). It collects data from the various sensors and sends it to the computer over a serial link, or wirelessly via the wireless module. It is also responsible for the control commands, and the board connects the remote-control receiver to the servos for manual flights. Its limitations come from the slow processor and the small RAM capacity.

Figure 4.3: Arduino Mega 2560 microcontroller

4.2.2 Microcontroller shield

The Arduino shield is a two-layer PCB. This customized shield was used to connect the microcontroller with the various sensors.


Figure 4.4: Customized Arduino shield. (a) Front layer; (b) back layer; (c) implementation, lower layer; (d) implementation, fixed to the Arduino

4.2.3 Servo Motors

The servo motor used was the standard high-torque ball-bearing servo Futaba S3010. Its speed is 0.20 s/60° at 4.8 V.

Figure 4.5: Servo Motor


4.2.4 Servo and potentiometer Sets

This is a customized PCB that connects the servos to the main Arduino shield. The board allows longer servo runs and simplifies the main Arduino shield. There are two boards of this kind: one for the servos and the other for the potentiometers.

Figure 4.6: Servo set. (a) Implementation; (b) PCB layout

4.2.5 Potentiometers

Potentiometers are used to measure the control-surface deflections. Their output is stored as the input signals, which are used for parameter identification purposes as will be shown later.

Figure 4.7: Potentiometer

4.2.6 IMU

As shown in the figure, the IMU used was the ArduIMU V3. It measures the Euler angles φ, θ and ψ, the body angular rates p, q and r, and the accelerations a_x, a_y and a_z. The specifications of this ArduIMU come from the motion processing unit inside it, the MPU-6000, which integrates a 3-axis MEMS gyroscope with a 3-axis MEMS accelerometer; the board also carries a 3-axis magnetometer. The ranges used are ±500 °/s for the gyroscope and ±4 g for the accelerometer, see [12].

Figure 4.8: Arduimu V3


4.2.7 Pressure Sensor

The atmospheric pressure sensor used was the BMP085 (module GY-65). The sensor has a range of 30-110 kPa (altitude +9000 to -500 m) with a resolution of 6 Pa (0.5 m) in low-power mode and 3 Pa (0.25 m) in high-linearity mode. The response time is 7.5 ms, see [11].
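For reference, the conversion from static pressure to altitude commonly used with this sensor family (and given in the BMP085 datasheet) is the international barometric formula; a minimal sketch:

```python
import math  # kept for symmetry with other math utilities; formula uses ** only

P0 = 101325.0  # sea-level standard pressure, Pa

def pressure_to_altitude(p, p0=P0):
    """International barometric formula, as in the BMP085 datasheet:
    h = 44330 * (1 - (p/p0)^(1/5.255)), with h in metres."""
    return 44330.0 * (1.0 - (p / p0) ** (1.0 / 5.255))
```

With the sensor's 30-110 kPa range this covers roughly the -500 m to +9000 m span quoted above; resolution in metres then follows from the pressure resolution via the local slope of this curve.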

Figure 4.9: BMP085 Module GY-65 Pressure Sensor

4.2.8 Velocity sensor

The MPXV7002DP piezoresistive transducer is a monolithic silicon differential pressure sensor with a range of ±2 kPa and a 2.5 % typical (6.25 % maximum) error over +10 °C to +60 °C.

Figure 4.10: Velocity Sensor

4.2.9 Wireless module

The wireless module used was the XBee PRO S2B. It is responsible for data transmission between the autopilot and the ground station, with an outdoor line-of-sight range of about 1500 m.

Figure 4.11: Xbee pro 52B wireless module

4.2.10 GPS module

The CJMCU-108-H is a u-blox GPS with a 5 Hz update rate, -148 dBm acquisition sensitivity (cold start) and -162 dBm tracking sensitivity. The position error is about 1.5 m.


Figure 4.12: GPS Module

4.2.11 Alpha-Beta sensor

The air-data sensor is an alpha-beta probe used for measuring the angle of attack and the sideslip angle. Such a sensor for small aircraft was difficult to find on the market, and the available one was very expensive, so the sensor had to be built. The design of the fins was taken from the Open Air Data Computer project website [4]. The fin obtained could only measure the AOA once the velocity reached about 14 m/s, and the alpha and beta fins were combined in one set. So the design was modified: first, a 1.8-scale fin was built to get readings at a velocity of about 7 m/s; second, the fins were separated, one at the right wing tip and the other at the left wing tip. This allows a better weight distribution and keeps the fins away from the propeller flow at the middle of the aircraft. A 3D model of the whole sensor was built and then manufactured in the ASTL. After the first test flight the system was modified, as shown in the flight-test chapter later.

Figure 4.13: First Design (Not used because of middle flow from motor propeller)


Figure 4.14: Air data fin drown by Solid Works

Figure 4.15: CNC manufacturing of Air data fin


Figure 4.16: Air data fin for Side Slip angle

Figure 4.17: Air data fin for angle of attack


Figure 4.18: The total AOA and Side slip sensors attached to the wing tip

4.2.12 Batteries

The power sources of the Arduino Mega and the servos were separated but share a common ground. The Arduino uses a 7.4 V, 2500 mAh battery, a capacity good enough to sustain operation for a long time without recharging. The servos use standard 4.8 V Ni-Cd batteries.

4.2.13 RC Transmitter and receiver

A Futaba RC transmitter was used for manual control of the servos. It sends signals to the receiver, which is connected to the Arduino; the Arduino in turn sends the signals to the servos.

4.3 Software

4.3.1 Arduino Mega Software

This software transfers data from the sensors to the Arduino microcontroller. It is based on the Arduino programming environment and collects about 27 readings from the various sensors. The data gathered can be used for parameter identification and for control purposes. The output data are in ASCII format; through the serial monitor, the data can be seen arriving row by row. Each


row contains all the sensor readings, as follows: RLL:0.07,PCH:0.08,YAW:0.33,altitude:-0.50,velocity:0.00,elev:-683,ail:-713,rdr:-713,alpha:-737,beta:-690,AN0:0.13,AN1:-0.62,AN2:0.22,AN3:17.78,AN4:-0.08,AN5:8190.33,lat:0.00,lon:0.00,gpsalt:17.00,gpsgc0.00,gpsgs0.00,fix:0,numsat:0,elevs:0,ails:0,ruds:0,throts:0. The RC receiver is connected to the Arduino Mega through its digital ports, so the Arduino can capture the RC remote signal, store it, and then send it to the servos through the PWM ports. The main code for the ArduIMU was taken from its website [12] and then modified as in Appendix C to obtain the correct angles; some websites were very helpful in that, see [1].
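A row in this format can be decoded generically by splitting on commas and then on the first colon. The sketch below is an illustrative Python parser, not the Simulink decoder actually used; fields whose separator was lost in logging are simply skipped.

```python
def parse_row(row):
    """Parse one ASCII telemetry row of comma-separated KEY:VALUE fields.

    Fields without a ':' separator are skipped; values that are not
    numeric are kept as strings. (Illustrative sketch only.)
    """
    out = {}
    for field in row.split(','):
        if ':' not in field:
            continue
        key, value = field.split(':', 1)
        try:
            out[key.strip()] = float(value)
        except ValueError:
            out[key.strip()] = value.strip()
    return out

sample = "RLL:0.07,PCH:0.08,YAW:0.33,altitude:-0.50,velocity:0.00,elev:-683"
data = parse_row(sample)
```

A ground-station side decoder along these lines turns each row into a keyed record ready for plotting or logging.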

Figure 4.19: Flow Chart of Arduino Software

4.3.2 Monitoring Software

The monitoring software plots the various data received from the autopilot in real time. It can also save the data for post-processing: saved data can be used for analyzing the recorded flight and for off-line parameter estimation. The software was built in the Simulink environment. The autopilot sends data in ASCII format and the monitoring software converts it to decimal. The software is also connected to the Google Earth program, which receives the GPS data from the Simulink program and plots the aircraft location using the longitude and latitude readings, see Appendix D.

Figure 4.20: Monitoring software. (a) The whole program; (b) the ASCII-to-decimal block


Figure 4.21: Simulink GCS

Figure 4.22: Flow Chart of Simulink GCS

4.3.3 Parameter Identification Software

The parameter identification software is a Matlab code. Its inputs are the measured sensor signals and control-surface deflection signals received from the autopilot. The code is based on the Hopfield neural network discussed in Chapter 3, and its outputs are the A and B matrices of the linear state-space models of the longitudinal and lateral aircraft dynamics. The whole software is shown in Appendix A.


Figure 4.23: Flow chart for Hopfield Neural Network PI


Chapter 5: Test flights and Results

5.1 KADET LT-40 RC model Test Flight

5.1.1 Introduction

The specifications of the aircraft used are shown in table 5.1. The test flight was performed at 9:00 am; the air was calm and suitable for the test. The flight lasted about 5 min, as shown by the red path on the map. Throughout the flight the data were collected and stored in the GCS, and post-processing was then performed.

Figure 5.1: The Flight Path

Table 5.1: KADET LT-40 RC model Specifications

Wing span      1778 mm
Wing area      58.1 dm²
Length         1447 mm
Flying weight  2490-2720 g
Wing loading   43-47 g/dm²
Engine         2-stroke .46 cu in (7.5 cc)


Figure 5.2: Ready to Fly

Figure 5.3: Captured photo from the flight test

5.1.2 PI of Longitudinal Dynamics

For the linearized longitudinal equations of motion we have

[\dot{u}\;\;\dot{\alpha}\;\;\dot{q}\;\;\dot{\theta}]^T = A\,[u\;\;\alpha\;\;q\;\;\theta]^T + B\,u    (5.1)

Our aim now is to do the following:
1- Collect all the available data for the state vector, its derivative and the input, i.e. (u, \alpha, q, \theta, \dot{u}, \dot{\alpha}, \dot{q}, \dot{\theta}, \delta_{elev}).
2- Estimate the data that could not be measured directly.
3- Identify the A and B matrices.


The unavailable data that could not be measured were the angular acceleration \(\dot{q}\) and the angle of attack (AOA) rate \(\dot{\alpha}\). For \(\dot{q}\), the angular rate \(q\) was differentiated; this requires a high sampling frequency for an accurate estimate (usually 100 Hz is used, but in this test only 10 Hz was available), which was corrected in the later test flight of the Low Super Trainer RC model. Also, \(\dot{u}\) was replaced with the linear acceleration in the x direction, \(a_x\), and \(\dot{\alpha}\) was replaced with the linear acceleration in the z direction, \(a_z\). So finally the state space model becomes

\[
\begin{bmatrix} a_x \\ a_z \\ \dot{q} \\ \dot{\theta} \end{bmatrix}
= A \begin{bmatrix} u \\ \alpha \\ q \\ \theta \end{bmatrix}
+ B \begin{bmatrix} \delta_{elev} \end{bmatrix}
\tag{5.2}
\]
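The sampling-rate limitation mentioned above can be illustrated with a small Python experiment (invented signal, not flight data): differentiating a 1 Hz pitch-rate oscillation sampled at 10 Hz versus 100 Hz.

```python
import numpy as np

# Illustration: finite-difference estimation of qdot from a sampled pitch rate.
# The same 1 Hz oscillation is differentiated at 10 Hz and at 100 Hz; the
# coarser rate gives a noticeably worse estimate of the derivative.
def qdot_max_error(fs, f_sig=1.0, T=2.0):
    t = np.arange(0.0, T, 1.0 / fs)
    q = np.sin(2 * np.pi * f_sig * t)                        # sampled angular rate
    qdot_true = 2 * np.pi * f_sig * np.cos(2 * np.pi * f_sig * t)
    qdot_est = np.gradient(q, 1.0 / fs)                      # numerical derivative
    return np.max(np.abs(qdot_est - qdot_true))

print(qdot_max_error(10.0), qdot_max_error(100.0))
```

The maximum derivative error shrinks by roughly an order of magnitude when the sampling rate is raised from 10 Hz to 100 Hz, which is the motivation for the higher rate used in Test 2.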

In the flight the throttle was held constant, so \(\delta_{throttle}\) was neglected. The data were fed to the Hopfield neural network code, using a good sample from the test of about 4 seconds with smooth inputs and outputs. The code was designed to iterate and produce estimated A and B matrices, which then constitute the estimated linear state space model of the aircraft. The input data collected from the test were applied to this model, and its state responses were compared to the measured states. The calculated error was about 7.5 %, where

\[
E = \frac{\sum_{1}^{s} \sum_{1}^{n} \lvert e_i \rvert}{n}
\tag{5.3}
\]

and \(s\) is the number of states and \(n\) is the number of samples for each state. The measured and estimated states are shown in figure 5.4.
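A sketch of this error metric in Python, following the normalisation used in the Matlab code of Appendix A (per-state absolute error as a percentage of the measured signal, averaged over the samples, then summed over the states):

```python
import numpy as np

# Hedged sketch of the fit-error metric of Eq. (5.3), with the normalisation
# of the Appendix A code: per-state absolute error as a percent of the
# measured signal magnitude, divided by the sample count, summed over states.
def fit_error(measured, estimated):
    """measured, estimated: arrays of shape (n_samples, n_states)."""
    n = measured.shape[0]
    per_state = np.abs(measured - estimated).sum(axis=0)
    per_state = 100.0 * per_state / np.abs(measured).sum(axis=0) / n
    return per_state.sum()   # total error over all states, in percent
```

For example, an estimate that is uniformly 10 % off on each of two states gives a total error of 2 % under this normalisation.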


Figure 5.4: Longitudinal states (measured and estimated). The yellow line is the elevator signal

A remark from this test: the pitch angle (the 4th subplot) shows two peaks, one at 0.5 s and another at about 2.3 s, and the difference between the measured and estimated signals at 2.3 s is larger than at 0.5 s. Note also that the rate of change of the elevator signal is higher at 2.3 s than at 0.5 s. This indicates that the aircraft response changes not only with the elevator signal but also with its rate of change. So another model was suggested, with two input signals instead of one: the elevator signal and the change in the elevator signal, \((\delta_{elev}, \dot{\delta}_{elev})\). The state space model then becomes

\[
\begin{bmatrix} a_x \\ a_z \\ \dot{q} \\ \dot{\theta} \end{bmatrix}
= A \begin{bmatrix} u \\ \alpha \\ q \\ \theta \end{bmatrix}
+ B \begin{bmatrix} \delta_{elev} \\ \dot{\delta}_{elev} \end{bmatrix}
\tag{5.4}
\]

Using the new state space model, the error was reduced from 7.5 % to 6.3 %; the results are shown in figure 5.5. Another remark is that the alpha signal has the maximum error, contributing 59.37 % of the total error, while the error in the other three states equals only 2.56 %. Based on this remark it was suggested to use another AOA and sideslip sensor.
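Building the augmented two-column input of equation 5.4 from a logged elevator signal amounts to appending its numerical derivative; a Python sketch (the signal here is invented for illustration):

```python
import numpy as np

# Sketch (invented signal): augmenting the elevator input with its rate, as in
# Eq. (5.4). The second input column is the finite-difference derivative.
fs = 10.0                                    # test-1 sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
delta_elev = np.deg2rad(5) * np.sign(np.sin(2 * np.pi * 0.5 * t))  # doublet-like
delta_elev_dot = np.gradient(delta_elev, 1 / fs)                   # elevator rate
U = np.column_stack([delta_elev, delta_elev_dot])                  # (n_samples, 2)
```

Each row of U is then one input sample \((\delta_{elev}, \dot{\delta}_{elev})\) fed to the identified model.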


Figure 5.5: Longitudinal states (measured and estimated) when using the elevator and the change in elevator as inputs. The yellow line is the elevator signal

The estimated state space model has the following form:

\[
\begin{bmatrix} a_x \\ a_z \\ \dot{q} \\ \dot{\theta} \end{bmatrix}
=
\begin{bmatrix}
 0.0858 &  0.2892 & -0.0004 & -9.7233 \\
-0.1890 & -4.8213 & 21.2426 & -0.0004 \\
-0.0335 & -0.1529 & -4.6181 &  0.0201 \\
-0.0091 & -0.0076 &  0.9953 & -0.0074
\end{bmatrix}
\begin{bmatrix} u \\ \alpha \\ q \\ \theta \end{bmatrix}
+
\begin{bmatrix}
15.6149 & -1.2472 \\
-0.0070 & -0.0070 \\
-6.5863 & -0.8107 \\
-1.3036 & -0.4347
\end{bmatrix}
\begin{bmatrix} \delta_{elev} \\ \dot{\delta}_{elev} \end{bmatrix}
\tag{5.5}
\]

The A matrix shows a good match, in the order of magnitude of its parameters, with the linear state space model derived in Nelson's Flight Stability and Automatic Control [9], as in equation 5.6.

\[
\begin{bmatrix} \dot{u} \\ \dot{w} \\ \dot{q} \\ \dot{\theta} \end{bmatrix}
=
\begin{bmatrix}
X_u & X_w & 0 & -g \\
Z_u & Z_w & u_0 & 0 \\
M_u + M_{\dot{w}} Z_u & M_w + M_{\dot{w}} Z_w & M_q + M_{\dot{w}} u_0 & 0 \\
0 & 0 & 1 & 0
\end{bmatrix}
\begin{bmatrix} u \\ w \\ q \\ \theta \end{bmatrix}
\tag{5.6}
\]

The root locus of the longitudinal dynamics is shown in figure 5.6. The long period poles lie on the real axis, which leaves two options. The first is that this is simply the aircraft's dynamics. The second is that the elevator signal was not sufficient to force the aircraft to exhibit all of its dynamics. So it is suggested in the coming test to use a doublet elevator input, which is very good at doing this job.
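For reference, a doublet input is simply an equal positive and negative pulse applied back to back; a Python sketch at the 10 Hz rate of this test (the 1 s pulse width and 5 degree amplitude are illustrative assumptions, not values from the thesis):

```python
import numpy as np

# Sketch of a doublet elevator input: a positive pulse immediately followed by
# an equal negative pulse. Pulse width and amplitude here are assumptions.
def doublet(pulse_s=1.0, fs=10.0, amp_deg=5.0):
    n = int(pulse_s * fs)
    return np.deg2rad(amp_deg) * np.concatenate([np.ones(n), -np.ones(n)])

u = doublet()
print(u.size)   # 20 samples = 2 s at 10 Hz
```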

Figure 5.6: Estimated Pole map of the Longitudinal dynamics

5.1.3 PI of Lateral Dynamics

For the linearized lateral equations of motion we have

\[
\begin{bmatrix} \dot{\beta} \\ \dot{p} \\ \dot{r} \\ \dot{\phi} \end{bmatrix}
= A \begin{bmatrix} \beta \\ p \\ r \\ \phi \end{bmatrix}
+ B \begin{bmatrix} \delta_r \end{bmatrix}
\tag{5.7}
\]

In this test all four states were differentiated to obtain their derivatives, as it was not possible to measure them directly. This is another reason to use a higher sampling frequency in the coming tests, to get good differentiation of the states. After setting up the HNN code with the lateral data and iterating, the state space model in equation 5.8 was reached. The order of magnitude of the parameters is reasonable compared with the general lateral state space model derived in Nelson [9], shown in equation 5.9.

\[
\begin{bmatrix} \dot{\beta} \\ \dot{p} \\ \dot{r} \\ \dot{\phi} \end{bmatrix}
=
\begin{bmatrix}
-10.4457 & -0.0015 & -3.8845 & 0.5912 \\
 -9.4239 & -6.3545 & -9.6868 & 0.1196 \\
 18.6195 & -0.3073 &  2.4556 & 0.6212 \\
 -0.0517 &  1.6458 &  0.6615 & 0.0308
\end{bmatrix}
\begin{bmatrix} \beta \\ p \\ r \\ \phi \end{bmatrix}
+
\begin{bmatrix}
  8.2618 \\
 45.1595 \\
-15.6159 \\
 -2.4332
\end{bmatrix}
\begin{bmatrix} \delta_r \end{bmatrix}
\tag{5.8}
\]

\[
\begin{bmatrix} \dot{\beta} \\ \dot{p} \\ \dot{r} \\ \dot{\phi} \end{bmatrix}
=
\begin{bmatrix}
\frac{Y_\beta}{u_0} & \frac{Y_p}{u_0} & -\left(1 - \frac{Y_r}{u_0}\right) & \frac{g\cos\theta_0}{u_0} \\
L_\beta & L_p & L_r & 0 \\
N_\beta & N_p & N_r & 0 \\
0 & 1 & 0 & 0
\end{bmatrix}
\begin{bmatrix} \beta \\ p \\ r \\ \phi \end{bmatrix}
\tag{5.9}
\]


The estimated lateral dynamic system is fed with the measured aileron data to get the estimated states. The estimated and measured states are shown in figure 5.7.

Figure 5.7: Lateral states (measured and estimated). The last line is the aileron signal

Again, a reminder that the AOA and sideslip sensors must be changed. The total error between measured and estimated states, calculated by equation 5.3, was 4.7 %; the sideslip angle alone accounts for 42.13 % of the total error, while the error in the other three states equals only 2.72 %.

The root locus of the lateral dynamics is shown in figure 5.8. The roots of a conventional aircraft are recovered: the spiral mode root, the two complex poles of the Dutch roll mode, and the roll pole.


Figure 5.8: Estimated Pole Map of the Lateral dynamics

5.1.4 Test Summary and Recommendations

Overall the results were appealing, with total errors of 6.3 % and 4.7 % for the longitudinal and lateral dynamics respectively. The total system succeeded in estimating the linear dynamics of the aircraft. At each stage, suggestions emerged along with the results. Here is a summary of the cautions to be taken into consideration in the next flight:

1. A better aircraft is recommended, because the one used was very old and had difficulties in flight, which prevented applying full doublet signals since it tended to diverge and fly unstably.

2. An electric motor is recommended for constant thrust and cleaner operation, but its magnetic field must then be taken into consideration.

3. A better design of the AOA and beta sensors is strongly recommended, because the highest error values came from them.

4. A central position for the AOA and beta sensors must be studied, because when the fins are at the wing tips they are affected by the rolling motion.

5. It was noticed that the servos show some jumpy signals when no orders are sent to them. This happens about once every 10 seconds.

6. Better GCS software is recommended, for better inspection of the test while it is performed, to assure the test is done the right way. For example, the servo signals should be displayed to confirm that the doublet input is applied, and the AOA and sideslip sensor data should be displayed to confirm they are normal, without spurious jumps.

7. For a servo to work, the transmitter sends data to the receiver, then from the receiver to the Arduino; the Arduino saves the data and also sends it to the servo. In this circuit, if a problem occurs in the Arduino board, the signal cannot transfer from the transmitter to the servos and the aircraft is then out of control. So a relay must be added to switch the signal directly from the receiver to the servos if the Arduino suddenly stops working.

8. For the collected data to be saved, the Arduino sends them over a wireless XBee module to the GCS. This places a restriction on the sampling frequency, and if the XBee stops working the data cannot be sent to the GCS, so a data-logging SD card is recommended.

9. A new PCB would be a better edition of the data gathering system: it allows a better sensor arrangement, lets the system sit in a closed box with tidier wiring, and ensures more safety for the circuit.

10. For the new aircraft, the aerodynamic stability and control derivatives could be calculated with Datcom and compared with the neural network results.

5.2 Low Super Trainer RC model

Figure 5.9: Low Super Trainer RC model

5.2.1 Modifications for Test 2

From Test 1, suggestions were given to get better results for Test 2. The modifications were done as follows:

1. A new RC model aircraft, called the "Low Super Trainer", was used for this test. Specifications of the aircraft are shown in table 5.2. It has a takeoff weight of about 3.5 kg. In this new model we could guarantee that the Y plane is a symmetry plane, so a pure longitudinal motion could be achieved. Also, the engine was replaced with an electric motor for a constant and cleaner source of thrust, as shown in figure 5.10.

Table 5.2: Low Super Trainer RC model Specifications

Wing Span: 1650 mm
Wing Area: 45 dm²
Length: 1250 mm
Flying Weight: 3450 g
Engine required: 2-stroke 0.40–0.46 cu in
Engine used: Brushless motor 46

Figure 5.10: Replacing the engine with an electric motor

2. The sampling frequency of data collection was doubled from 10 Hz to 20 Hz. This increase assures smoother data and better values when differentiating the states.

3. The problem of the alpha and beta sensors was largely solved here by the following modifications:

a. The fin was replaced with a similar fin scaled by 0.7. The new, smaller fin guarantees lower fin inertia and mass, which helps increase the fin's response to changes in the flow direction; the lighter the fin, the easier it is for a low speed flow to rotate it.

b. Decreasing the mass is not enough on its own; it is also necessary to decrease the potentiometer friction, by using a very low friction encoder.

c. A central position for the air probe could help get the right air data, but because the aircraft has a pusher motor fixed at the center, the air data sensors were again fixed at the wing tips. To guarantee good results, two sets of air data sensors were fitted, one at the right wing tip and another at the left wing tip. Before the flight one of the angle of attack sensors was found to have stopped working, so only one AOA sensor was used to measure the angle of attack.

d. Limiters were added to the fins to make sure the fins do not rotate beyond 90 degrees and settle in the opposite direction. Finally, all sensors and control surface deflections were calibrated and tested before the test flight.

Figure 5.11: The fin of the old sensor and the smaller one of the new sensor

Figure 5.12: The black potentiometer and the golden Encoder


Figure 5.13: The modified Air data system

Figure 5.14: The air data system setup for both aircraft


Figure 5.15: The air data systems of the two aircraft

Figure 5.16: The two sets of the new Air data system

The results of the sensors were very appealing, and each pair gave very close results.


Figure 5.17: The two velocity sensors data during the flight

Figure 5.18: The two beta sensors data during the flight

5.2.2 Flight Test

The flight was done at about 3:00 pm in the Sheikh Zayed flight area. The airflow was calm and suitable for the test. A failure occurred in the electric motor, which suddenly stopped working during the flight; the plane fell, the engine mount was broken, and one of the air data sets was torn from its place. However, a flight of about 30 seconds was successfully recorded, a doublet input on the elevator was achieved, and a roughly pure longitudinal motion was obtained. In the coming sections the longitudinal and lateral dynamics are discussed.

5.2.3 Longitudinal Dynamics Identification

To activate both the short and long period dynamics it is recommended to use a 3-2-1-1 input, see [6]. In this test a doublet input was applied at various frequencies, which helped activate the dynamics, though it is recommended to enhance this in the coming flights.

Figure 5.19: Doublet elevator input

The throttle was added to the input vector even though it did not change, and this enhanced the results. It may be that, as an initial value, the throttle helps the states start at and maintain a certain value until they change with the elevator or other states. The results show that the error was about 1.47 %, and the results of the alpha sensor were acceptable, unlike in Test 1. Using a sampling frequency of 20 Hz also helped in differentiating the states. In this test it was acceptable to use \(\dot{u}\) rather than \(a_x\), and \(\dot{\alpha}\) rather than \(a_z\).
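A 3-2-1-1 input, as recommended in [6], is a sequence of alternating pulses whose widths are in the ratio 3:2:1:1, which spreads the excitation over a wider frequency band than a doublet. A Python sketch (the 0.5 s pulse unit and 5 degree amplitude are illustrative assumptions, not values from the thesis):

```python
import numpy as np

# Sketch of a 3-2-1-1 excitation: alternating pulses with widths in the ratio
# 3:2:1:1, built at the test-2 rate of 20 Hz. Unit width and amplitude assumed.
def input_3211(unit_s=0.5, fs=20.0, amp_deg=5.0):
    widths = [3, 2, 1, 1]
    signs = [1.0, -1.0, 1.0, -1.0]
    seq = np.concatenate([s * np.ones(int(w * unit_s * fs))
                          for w, s in zip(widths, signs)])
    return np.deg2rad(amp_deg) * seq

u = input_3211()
print(u.size)   # 70 samples = 3.5 s at 20 Hz
```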

\[
\begin{bmatrix} \dot{u} \\ \dot{\alpha} \\ \dot{q} \\ \dot{\theta} \end{bmatrix}
= A \begin{bmatrix} u \\ \alpha \\ q \\ \theta \end{bmatrix}
+ B \begin{bmatrix} \delta_{elev} \\ \delta_{throttle} \end{bmatrix}
\tag{5.10}
\]


Figure 5.20: Measured and Estimated Longitudinal States

For the poles, the results show both the short and long period complex pole pairs. The order of magnitude of the A matrix is in good agreement with Nelson [9].

\[
\begin{bmatrix} \dot{u} \\ \dot{\alpha} \\ \dot{q} \\ \dot{\theta} \end{bmatrix}
=
\begin{bmatrix}
-1.0422 &  -76.9002 &  -0.1338 & -9.7870 \\
-0.1386 &  -15.8909 &   1      & -0.0106 \\
 4.1345 & -216.1426 & -21.4856 & -0.0027 \\
-0.0077 &   -0.0102 &   1.1431 & -0.0126
\end{bmatrix}
\begin{bmatrix} u \\ \alpha \\ q \\ \theta \end{bmatrix}
+
\begin{bmatrix}
 47.8624 &  0.2262 \\
 -2.3533 &  0.0215 \\
240.0769 & -1.0633 \\
 -0.0446 & -0.0034
\end{bmatrix}
\begin{bmatrix} \delta_{elev} \\ \delta_{throttle} \end{bmatrix}
\tag{5.11}
\]
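As a quick numerical check of the statement about the poles, the eigenvalues of the estimated A matrix of equation 5.11 can be computed directly; they should form two stable complex pairs, the short- and long-period-like modes.

```python
import numpy as np

# Eigenvalues of the A matrix identified in Eq. (5.11).
A = np.array([[-1.0422,  -76.9002,  -0.1338, -9.7870],
              [-0.1386,  -15.8909,   1.0,    -0.0106],
              [ 4.1345, -216.1426, -21.4856, -0.0027],
              [-0.0077,   -0.0102,   1.1431, -0.0126]])
poles = np.linalg.eigvals(A)
print(np.sort_complex(poles))   # two oscillatory pairs in the left half plane
```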

Figure 5.21: Longitudinal Pole Map

5.2.4 Lateral Dynamics Identification

It was not possible to get a pure lateral-directional motion from the flight with a doublet aileron or rudder input, as the engine cut off before that part of the test and the only achievable control surface deflection was the elevator. So we cannot make a final decision about the lateral identification. However, testing the data gave some acceptable results, although the error cannot be a criterion here even though it is lower than in the first test (error = 4 % versus 4.7 % in Test 1): when the yaw angle exceeds 180 degrees, the data show a false jump from 180 to -180, and this increases the error. The measured and estimated data and the root locus results are given below. The order of magnitude of the dynamic matrix A is very close to that given in Nelson [9].
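The ±180 degree jump described above can be removed before computing the error by unwrapping the yaw signal; a minimal Python sketch on an invented heading sequence:

```python
import numpy as np

# Removing the +/-180 deg wrap before computing errors: unwrap the yaw signal.
# The heading samples below are invented for illustration.
psi = np.array([170.0, 175.0, -178.0, -172.0])        # wrapped yaw, degrees
psi_cont = np.rad2deg(np.unwrap(np.deg2rad(psi)))     # continuous heading
print(psi_cont)   # the wrapped third sample becomes about 182, then 188
```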

Figure 5.22: Lateral states (measured and estimated). The last line is the aileron signal

\[
\begin{bmatrix} \dot{\beta} \\ \dot{p} \\ \dot{r} \\ \dot{\phi} \end{bmatrix}
=
\begin{bmatrix}
-44.7421 &   2.4141 &   2.0447 &  3.5779 \\
633.3455 & -88.9262 & -94.5293 & -0.0327 \\
440.4360 &  -6.4201 & -24.9187 & -0.1893 \\
 -0.0200 &   0.8450 &  -0.2985 & -0.0426
\end{bmatrix}
\begin{bmatrix} \beta \\ p \\ r \\ \phi \end{bmatrix}
+
\begin{bmatrix}
  -0.0130 \\
 410.8903 \\
-310.0469 \\
   5.8755
\end{bmatrix}
\begin{bmatrix} \delta_{ail} \end{bmatrix}
\tag{5.12}
\]


Figure 5.23: Lateral Pole Map


Chapter 6: Conclusion and Future Work

The Hopfield neural network, combined with the designed framework, delivered appealing results for the offline PI problem. Even with the low processing power and RAM of the Arduino Mega compared with other microcontrollers used for PI purposes, it showed acceptable performance and durability in the project. The purpose of the project was to use low-cost hardware to reduce the required funds for a first trial of PI, as a proof of concept; a more capable microcontroller is planned for the second iteration. A real flight to learn the dynamics of an aircraft seems to be mandatory for autopilot design, and we cannot fully depend on Datcom or Roskam to get the dynamics of a small UAV. No remarkable work can be done by only one person: teamwork and the accumulation of knowledge is the way to successful and beneficial work.

6.1 Suggested Future Work

6.1.1 Parameter Identification

PI work may be a second iteration on offline PI, or online parameter identification. Suggested disciplines are:

6.1.1.1 Hardware

1. Using Arduino
2. Using MyRIO (LabVIEW)
3. Building an IMU

6.1.1.2 Software

A. Monitoring software: try to increase its speed. You may use Visual Basic, LabVIEW, or Simulink.
B. PI software: you may iterate again on the neural network, or try another method such as genetic algorithms or frequency response techniques. You may use a new method for offline PI, or use a neural network online.

6.1.2 Autopilot

Using the calculated state space models, you may now move to the next stage of the autopilot: stability control. Stability control was introduced in the introduction chapter. You may work on altitude hold, then the heading autopilot, and finish the work with a horizontal loop mission.


Figure 6.1: Suggested Work


References

[1] DIY Drones, http://diydrones.com/.

[2] RAM Electronics, http://ram-e-shop.com/oscmax/catalog/.

[3] Future Electronics Egypt, http://www.fut-electronics.com/.

[4] Zogopoulos Papaliakos, G., Capelli, G., J. L. J. Basic Air Data website, http://www.basicairdata.eu/.

[5] Gu, Y. Design and Flight Testing Actuator Failure Accommodation Controllers on WVU YF-22 Research UAVs. PhD thesis, College of Engineering and Mineral Resources, West Virginia University, Morgantown, West Virginia, 2004.

[6] Raol, J. R., Girija, G., and Singh, J. Modelling and Parameter Estimation of Dynamic Systems, 1st ed. The Institution of Engineering and Technology, 2004.

[7] Raol, J. R. Neural network architectures for parameter estimation of dynamical systems. IEE Proceedings 143 (July 1996), 386-394.

[8] McGrail, A. K. Onboard Parameter Identification for a Small UAV. MSc thesis, College of Engineering and Mineral Resources, West Virginia University, 2012.

[9] Nelson, R. C. Flight Stability and Automatic Control, 1st ed. McGraw-Hill Book Company, 1989.

[10] Seanor, B. A. Flight Testing of a Remotely Piloted Vehicle for Aircraft Parameter Estimation Purposes. PhD thesis, College of Engineering and Mineral Resources, West Virginia University, Morgantown, West Virginia, 2002.

[11] SparkFun, https://www.sparkfun.com/.

[12] ArduIMU Team, https://code.google.com/p/ardu-imu/wiki/introductionpage.


Appendix A: Hopfield Neural Network Matlab Code

In this code:
1. Choose the data file to use.
2. Initialize with initial A, B matrices.
3. Choose ro to vary for the sigmoid function.
4. Introduce the data and states.
5. Choose lamda for the sigmoid function.
6. Get the weighting matrix and the bias vector.
7. Iterate to get new A, B matrices for each ro.
8. Get the response for each ro.
9. Choose the lowest-error ro.
10. Again get its A, B.
11. Iterate with the new A, B.
Hint: you may stop the program several times; in that case replace the last given A, B (AA, BB) with the initial A, B.

Here is the full code:

close all; clear all; clc;
load 'test2vb2.mat'

%% Test regime
start = 671 + 3.7*20 + 2.5*20;   % 805
stop = 671 + 200;                % 906

%% dt and time
dt = q.time(start+1) - q.time(start);
samples = stop - start + 1;
t = q.time(start:stop);
iter = 1;
stepp = 1;
dtt = 1;

%% Neural network parameters
r1 = [-200 : 50.1 : 200];
r2 = [-.1 : .0313 : .1];
r3 = [-4 : 1.40123 : 4];
r4 = [-400000 : 10000.21 : 400000];
r5 = [-.01 : .00313 : .01];
kat = [r1, r2, r3];

%% Control surfaces
elevs = elevservo.data * pi/180;
ails = ailservo.data * pi/180;
throtles = throtleservo.data + 10;
rdrs = rdrservo.data * pi/180;

[elev, ail, throtle, rdr] = contsurfaces(elevs, ails, throtles, rdrs);

elev = elev(start:stop) + 1*pi/180;
ail = ail(start:stop);
throtle = throtle(start:stop);
rdr = rdr(start:stop);

%% Initial conditions
AA = [0    0.1353    1.7769  -9.8
      0   -9.8555    1.2098  -0.0254
      0 -184.2615   -9.7172  -0.1785
      0   -0.0133    1.1742  -0.0613];
BB = [ 10.8370
       -1.7455
      166.1125
        0.0192];

% AA = [ 0.0858  0.2892 -0.0004 -9.7233
%       -0.1890 -4.8213 21.2426 -0.0004
%       -0.0335 -0.1529 -4.6181  0.0201
%       -0.0091 -0.0076  0.9953 -0.0074];
% BB = [15.6149; -0.0070; -6.5863; -1.3036];

AA = [ 0.1603   4.7492   0.1893 -9.1854
      -0.0236 -11.4540   1.6679 -0.0170
      -0.3632 -51.8025 -12.2717 -1.0140
      -0.0073  -0.0221   1.0347 -0.0267];
BB = [  4.0488 0
       -3.6889 0
      110.0381 0
        0.0087 0];

AA = [-1.0422  -76.9002  -0.1338 -9.7870
      -0.1386  -15.8909   1      -0.0106
       4.1345 -216.1426 -21.4856 -0.0027
      -0.0077   -0.0102   1.1431 -0.0126];
BB = [ 47.8624  0.2262
       -2.3533  0.0215
      240.0769 -1.0633
       -0.0446 -0.0034];

%% Sizes & initial conditions
[ma, na] = size(AA);
[mb, nb] = size(BB);

%% States
data(:,1) = veldot.data(start:stop);
% data(:,1) = axk.data(start:stop);
data(:,2) = alphadot.data(start:stop);
data(:,3) = qdot.data(start:stop);
data(:,4) = pchdot.data(start:stop);
% data(:,4) = qq .* cos(Roll.data(start:stop)*pi/180);
% data(:,4) = qk.data(start:stop);
data(:,5) = vk.data(start:stop) + .5;
data(:,6) = alphak.data(start:stop);
data(:,7) = qk.data(start:stop);
data(:,8) = pchk.data(start:stop);
data(:,9) = elev;
data(:,10) = throtle;
% data(:,11) = ailservo.data(start:stop);

nos = 4;                 % number of states
u = [elev throtle];
% u = [elev];
n = 2;                   % number of inputs
noi = 2;                 % number of inputs
Elast = 10^100;
xint = [data(1,5); data(1,6); data(1,7); data(1,8)];

%% The neural network method "Hopfield": getting W, b
for i = 1:nos
    for j = 1:nos
        alphaw(i,j) = sum(data(:,nos+i) .* data(:,nos+j));
    end
end
for i = 1:nos
    for j = 1:noi
        betaw(i,j) = sum(data(:,nos+i) .* data(:,2*nos+j));
    end
end
for i = 1:noi
    for j = 1:noi
        thetaw(i,j) = sum(data(:,2*nos+i) .* data(:,2*nos+j));
    end
end
W1 = zeros(nos*nos, nos*nos);
W2 = zeros(nos*nos, noi*nos);
W4 = zeros(nos*noi, nos*noi);
for i = 1:nos
    W1((i-1)*nos+1:i*nos, (i-1)*nos+1:i*nos) = alphaw;
end
for i = 1:nos
    W4((i-1)*noi+1:i*noi, (i-1)*noi+1:i*noi) = thetaw;
end
for i = 1:nos
    W2((i-1)*nos+1:i*nos, (i-1)*noi+1:i*noi) = betaw;
end

W = -[W1  W2
      W2' W4];           % weighting matrix

for j = 1:nos
    for i = 1:nos
        b1(i,j) = sum(data(:,j) .* data(:,nos+i));
    end
end
for j = 1:nos
    for i = 1:noi
        b2(i,j) = sum(data(:,j) .* data(:,2*nos+i));
    end
end

[mb1, nb1] = size(b1);
[mb2, nb2] = size(b2);
b = -[reshape(b1, mb1*nb1, 1); reshape(b2, mb2*nb2, 1)];   % the bias vector

%%
BETA = [reshape(AA', ma*na, 1); reshape(BB', mb*nb, 1)];
Betaright = [reshape(AA', ma*na, 1); reshape(BB', mb*nb, 1)];
Beta_new = Betaright;
betadot = zeros(1,24);
Q = zeros(1,24);
U = zeros(1,24);

lmda(1:24) = 10^-5;
% lmda(1:16) = 10^-6;
lmda(13:16) = 10^-8;
lmda(3) = 10^-6;
lmda(4) = 10^-9;
lmda(8) = 10^-8;
lmda(12) = 10^-8;
lmda(7) = 10^-86;
% lmda(9) = 10^-6;
% lmda(11) = 10^-6;
% lmda(18) = 10^-6;
% lmda(22) = 10^-6;
% lmda(21) = 10^-6;
% lmda(5) = 10^-6;
% lmda(10) = 10^-6;
% lmda(20) = 10^-6;
% lmda(24) = 10^-6;
% lmda(10) = 10^-4;

for LoL = 1:520000000
% for wew = [1,2,5,6,7,9,10,11,17,18,19]
% for wew = [1:16,17,19,21,23]
for wew = [1:24]
    for kit = 1:length(kat)
        ro = kat(kit);
        Betaright = [reshape(AA', ma*na, 1); reshape(BB', mb*nb, 1)];
        Betaold = Betaright;
        BBU = Betaright;
        BBYT = 0*Betaright;
        figure(9090)
        for i = 1:iter
            for shof = wew
                roo = ro;
                U(shof) = W(shof,:) * Betaold + b(shof);
                Q(shof) = (lmda(shof)*(roo^2 - Betaold(shof)^2)) / (2*roo);   % derivative of the sigmoid
                % Q = (lmda .* (ro^2*ones(nos*nos+noi*nos,1) - Betaold.^2)) / (2*ro);
                betadot(shof) = U(shof)*Q(shof);
                Beta_new(shof) = Betaold(shof) + (betadot(shof)*dtt);
                Betaold(shof) = Beta_new(shof);
            end
            Beta = Beta_new;
            A = reshape(Beta(1:nos*nos), nos, nos)';
            B = reshape(Beta(nos*nos+1:length(Beta)), noi, nos)';
            Remm = rem(i, stepp);
            if Remm == 0
                BBU = [BBU, reshape(Beta, nos*nos+noi*nos, 1)];
                BBYT = [BBYT, reshape(betadot, nos*nos+noi*nos, 1)];
                C = [1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 1];
                D = [0 0; 0 0; 0 0; 0 0];
                t = elevservo.time(start:stop) - elevservo.time(start);
                x0 = xint;                         % initial condition of system states
                [y] = lsim(A, B, C, D, u, t, x0);  % do the simulation
                for kuy = 1:4
                    ui = abs(data(:,kuy+nos) - y(:,kuy));
                    ytt = sum(ui);
                    error(kuy) = ((ytt/sum(abs(data(:,kuy+nos))))*100)/samples;
                end
                E = sum(error);
                % error(5) = E;
                plot(i, E)
                hold on
            end
        end
        Error(kit) = E;
    end

    wew
    if Elast > min(Error)
        Elast = min(Error);
        toktok = find(Error == min(Error));
        ro = kat(toktok(1))
        [min(Error) wew]

        Betaright = [reshape(AA', ma*na, 1); reshape(BB', mb*nb, 1)];
        Betaold = Betaright;
        BBU = Betaright;
        BBYT = 0*Betaright;
        figure(9090)
        set(gcf, 'units', 'normalized', 'outerposition', [.4 .4 .4 .6])
        for i = 1:iter
            for shof = wew
                roo = ro;
                U(shof) = W(shof,:) * Betaold + b(shof);
                Q(shof) = (lmda(shof)*(roo^2 - Betaold(shof)^2)) / (2*roo);   % derivative of the sigmoid
                % Q = (lmda .* (ro^2*abs(Betaright.^2) - Betaold.^2)) / (2*ro);
                % Q = (lmda .* (ro^2*ones(nos*nos+noi*nos,1) - Betaold.^2)) / (2*ro);
                betadot(shof) = U(shof)*Q(shof);
                Beta_new(shof) = Betaold(shof) + (betadot(shof)*dtt);
                Betaold(shof) = Beta_new(shof);
            end
            Beta = Beta_new;
            A = reshape(Beta(1:nos*nos), nos, nos)';
            B = reshape(Beta(nos*nos+1:length(Beta)), noi, nos)';
            Remm = rem(i, stepp);
            if Remm == 0
                BBU = [BBU, reshape(Beta, nos*nos+noi*nos, 1)];
                BBYT = [BBYT, reshape(betadot, nos*nos+noi*nos, 1)];
                C = [1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 1];
                D = [0 0; 0 0; 0 0; 0 0];
                t = elevservo.time(start:stop) - elevservo.time(start);
                x0 = xint;                         % initial condition of system states
                [y] = lsim(A, B, C, D, u, t, x0);  % do the simulation
                for kuy = 1:4
                    ui = abs(data(:,kuy+4) - y(:,kuy));
                    ytt = sum(ui);
                    error(kuy) = ((ytt/sum(abs(data(:,kuy+nos))))*100)/samples;
                end
                E = sum(error);
                % error(5) = E;
                plot(i, E)
                hold on
            end
        end

        %%
        figure(99)
        set(gcf, 'units', 'normalized', 'outerposition', [0 .2 .4 .8])
        subplot(5,1,1)
        plot(t, y(:,1), 'm', 'linewidth', 3)
        hold on
        plot(t, data(:,5), 'r', 'linewidth', 3)
        hold off
        title('LONG. States u \alpha q \theta')
        legend('estimated', 'measured')
        legend('boxoff')

        subplot(5,1,2)
        plot(t, y(:,2)*180/pi, 'm', 'linewidth', 3)
        hold on
        plot(t, data(:,2+4)*180/pi, 'r', 'linewidth', 3)
        hold off
        subplot(5,1,3)
        plot(t, y(:,3)*180/pi, 'm', 'linewidth', 3)
        hold on
        plot(t, data(:,3+4)*180/pi, 'r', 'linewidth', 3)
        hold off
        subplot(5,1,4)
        plot(t, y(:,4)*180/pi, 'm', 'linewidth', 3)
        hold on
        plot(t, data(:,4+4)*180/pi, 'r', 'linewidth', 3)
        hold off

        subplot(5,1,5)
        plot(t, elev*(180/pi), 'linewidth', 3)
        hold on
        plot(t, throtle, 'r', 'linewidth', 3)
        legend('elev', 'throtle')
        legend('boxoff')
        hold off

        % print('example', '-dpng', '-r1000');   % save as PNG with 1000 DPI

        %%
        BBB = B(:,1);
        C = [0 0 0 1];
        D = 0;
        figure(3)
        set(gcf, 'units', 'normalized', 'outerposition', [.8 .4 .2 .5])

        AAA = A;
        Ucons = data(1,5);
        AAA(2,1) = A(2,1)*Ucons;
        AAA(2,3) = A(2,3)*Ucons;
        AAA(2,4) = A(2,4)*Ucons;
        AAA(1,2) = A(1,2)/Ucons;
        AAA(3,2) = A(3,2)/Ucons;
        AAA(4,2) = A(4,2)/Ucons;
        BBB(2) = BBB(2)*Ucons;
        [bb, aa] = ss2tf(AAA, BBB, C, D);
        h = tf(1, aa);
        rlocus(h, 0)

        AA = A;
        BB = B;
        figure(61)
        plot(LoL, E, '*')
        hold on
        set(gcf, 'units', 'normalized', 'outerposition', [.4 .10 .6 .3])
    end
end
end


Appendix B: ArduImu V3 Codes

This is not all the code; it is a sample showing where this code differs from the website code. These changes were made to get the right angles.

Hint: when uploading a code to the ArduIMU, use the FOCA (USB-to-serial adapter) and connect through its serial port; while uploading, press the reset button on the ArduIMU and keep it pressed until the upload finishes.

Here are the lines that need to be changed:

// I use this web: http://www.magnetic-declination.com/
#define MAGNETIC_DECLINATION 4.13333  // corrects magnetic bearing to true north

// LPR530 & LY530 sensitivity (from datasheet)
// => 3.33 mV/deg/s, 3.22 mV/ADC step => 1.03
// Tested values: 0.96, 0.96, 0.94
#define Gyro_Gain_X 0.92  // X axis gyro gain
#define Gyro_Gain_Y 0.92  // Y axis gyro gain
#define Gyro_Gain_Z 0.94  // Z axis gyro gain
#define Gyro_Scaled_X(x) x*ToRad(Gyro_Gain_X)  // Return the scaled ADC raw data of the gyro in radians per second
#define Gyro_Scaled_Y(x) x*ToRad(Gyro_Gain_Y)  // Return the scaled ADC raw data of the gyro in radians per second
#define Gyro_Scaled_Z(x) x*ToRad(Gyro_Gain_Z)  // Return the scaled ADC raw data of the gyro in radians per second
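What the Gyro_Scaled_* macros compute can be sketched off-target (a minimal illustration, not the ArduIMU source; the 0.92 gain is the tested X-axis sensitivity quoted above, and to_rad stands in for the library's ToRad macro):

```cpp
#include <cassert>
#include <cmath>

// Tested X-axis sensitivity from the listing above: deg/s per ADC step.
constexpr double kGyroGainX = 0.92;

// Equivalent of ToRad(): degrees to radians.
double to_rad(double deg) { return deg * 3.14159265358979323846 / 180.0; }

// Equivalent of Gyro_Scaled_X(x): ADC raw data scaled to rad/s.
double gyro_scaled_x(double raw) { return raw * to_rad(kGyroGainX); }
```

A raw reading of 100 ADC steps then corresponds to 92 deg/s, about 1.606 rad/s.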

#endif

#if BOARD_VERSION == 3
#define SERIAL_MUX_PIN 7
#define RED_LED_PIN 5
#define BLUE_LED_PIN 6
#define YELLOW_LED_PIN 5  // Yellow LED is not used on ArduIMU v3

// MPU6000 4g range => g = 8192
#define GRAVITY 8192  // This is equivalent to 1G in the raw data coming from the accelerometer


#define Accel_Scale(x) x*(GRAVITY/9.81)  // Scaling the raw data of the accel to actual acceleration in meters per second squared

// MPU6000 sensitivity (theoretical 0.0152 => 1/65.6 LSB/deg/s at 500 deg/s)
// (theoretical 0.0305 => 1/32.8 LSB/deg/s at 1000 deg/s)
// (0.0609 => 1/16.4 LSB/deg/s at 2000 deg/s)
#define Gyro_Gain_X 0.0609
#define Gyro_Gain_Y 0.0609
#define Gyro_Gain_Z 0.0609

Add these lines to transfer data between the ArduIMU and the Arduino:

// Wire.beginTransmission(SLAVE_ADDRESS & SLAVE_ADDRESS2);
Wire.beginTransmission(SLAVE_ADDRESS);
I2C_writeAnything(X);
I2C_writeAnything(Y);
I2C_writeAnything(Z);
I2C_writeAnything(xgyro);
I2C_writeAnything(ygyro);
I2C_writeAnything(zgyro);
I2C_writeAnything(xaccel);
I2C_writeAnything(yaccel);
Wire.endTransmission();
// delay(2000);

Wire.beginTransmission(SLAVE_ADDRESS2);
I2C_writeAnything(zaccel);
I2C_writeAnything(lat);
I2C_writeAnything(lon);
I2C_writeAnything(gpsalt);
I2C_writeAnything(gpsgs);
I2C_writeAnything(gpsgc);
I2C_writeAnything(fix);
I2C_writeAnything(numsat);
Wire.endTransmission();

delay(15);
}
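The data goes out in two transmissions of eight values each because the Arduino Wire library buffers one I2C transmission in a 32-byte buffer, and eight 4-byte floats fill it exactly; a ninth value would not fit. A minimal off-target sketch of that packing arithmetic (pack_float is a hypothetical helper standing in for I2C_writeAnything; the buffer size and float width are the assumptions here):

```cpp
#include <cassert>
#include <cstddef>
#include <cstring>

constexpr std::size_t kWireBufferSize = 32;  // Arduino Wire TX buffer (bytes)

// Pack one float into a byte buffer, as I2C_writeAnything does; returns the
// new write offset, or the old offset unchanged if the value would not fit.
std::size_t pack_float(unsigned char *buf, std::size_t offset, float v) {
    if (offset + sizeof(float) > kWireBufferSize) return offset;  // would overflow
    std::memcpy(buf + offset, &v, sizeof(float));
    return offset + sizeof(float);
}
```

Eight floats advance the offset to exactly 32; a ninth call is refused, which is why zaccel and the GPS fields go in a second transmission to SLAVE_ADDRESS2.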


In the DCM code, use these lines:

void Euler_angles(void)
{
#if (OUTPUTMODE == 2)
  // Only accelerometer info (debugging purposes)
  roll = (1*atan2(Accel_Vector[1], Accel_Vector[2]));  // *57.2957795131;  // atan2(acc_y, acc_z)
  // roll = (-0.274*(roll^2)) + (6.6239*roll) + 0.2537;
  pitch = (-1*asin((Accel_Vector[0])/(double)GRAVITY));  // *57.2957795131;  // asin(acc_x)
  // pitch = (0.0004*(pitch^2)) + (0.2564*pitch) + 0.1953;
  yaw = 0;
#else
  pitch = -1*asin(DCM_Matrix[2][0]);
  roll = 1*atan2(DCM_Matrix[2][1], DCM_Matrix[2][2]);
  yaw = atan2(DCM_Matrix[1][0], DCM_Matrix[0][0]);
#endif
}
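In the OUTPUTMODE == 2 branch above, attitude comes from gravity alone: roll = atan2(acc_y, acc_z) and pitch = -asin(acc_x / 1g). A standalone sketch of that geometry (an illustration, not the ArduIMU source; raw counts assumed scaled so 1 g = 8192, as in the GRAVITY define):

```cpp
#include <cassert>
#include <cmath>

constexpr double kGravityCounts = 8192.0;  // 1 g in raw MPU6000 counts (4 g range)

// Accelerometer-only attitude, as in Euler_angles() with OUTPUTMODE == 2.
void accel_attitude(double ax, double ay, double az, double &roll, double &pitch) {
    roll  = std::atan2(ay, az);               // rad, from the gravity direction
    pitch = -std::asin(ax / kGravityCounts);  // rad; valid only for |ax| <= 1 g
}
```

Level, unaccelerated flight (ax = ay = 0, az = 1 g) gives roll = pitch = 0; the estimate is only trustworthy when the vehicle is not accelerating, which is why the DCM branch is the one used in flight.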


Appendix C: Arduino Mega code

This code is uploaded into the Arduino Mega microcontroller so as to collect all the sensors' data and save them.

#include <FastSerial.h>
FastSerialPort0(Serial);
#include <Wire.h>
#include <PID_v1.h>
#include <I2C_Anything.h>
#include <BMP085.h>  // altitude sensor
const float AIRSPEED_CH = 3;  // for airspeed sensor

#include <Servo.h>

#include "Arduino.h"
#include "Receiver.h"
#include "Receiver_Main.h"

Servo elevservo;  // create servo object to control a servo
Servo ailservo;
Servo rudservo;
Servo throttle;
float elevs;
float elevss;  // variable to store the servo position
float ails;
float ailss;
float ruds;
float rudss;
float throts = 0;
float alpha = 0;  // these for potentiometers
float beta = 0;
int aileron = 0;
int rudder = 0;
int f, alt, alt2, H_R;
float alphast;
float betast;

const byte MY_ADDRESS = 42;
const byte MY_ADDRESS2 = 27;


float V, airspeed, ref_pressure, air_pressure, pressure_diff;
float airspeed_ratio = 1.633;
int i;
float time = 0;

float X;
float Y;
float Z;
float xgyro;
float ygyro;
float zgyro;
float xaccel;
float yaccel;
float zaccel;
float lat;
float lon;
float gpsalt;
float gpsgc;

volatile float gpsgs;
volatile double fix;
volatile double numsat;
int ADD;
float v = 0;
float x = 0;
float y = 0;
float z = 0;
float Vfss = 3.5;
float Vs = 5.0;

unsigned long Ms_switch;
unsigned long Ms_Elevator;
unsigned long Ms_Aileron;
unsigned long Ms_Rudder;
unsigned long Ms_Throttle;

// int Switch_pin = 10;
// int Elevator_pin = 2;
/* int Aileron_pin = 6;
int Rudder_pin = 4;
int Throttle_pin = 2; */

BMP085 bmp;
void setup()
{
  Serial.begin(111111, 128, 16);
  alphast = analogRead(1)*90/27;


  betast = analogRead(2)*90/27;

  initialize_receiver();

  bmp.begin();
  H_R = bmp.readAltitude(101325);

  //----------
  // SETTING REFERENCE VALUES FOR PRESSURE //

  ref_pressure = ((((analogRead(AIRSPEED_CH)*5.0/1024.0) - .0625*4.00)/5.0) - .5)/.2;

  for (i = 1; i <= 200; i++)
  {
    ref_pressure = ((((analogRead(AIRSPEED_CH)*5.0/1024.0) - .0625*4.00)/5.0) - .5)/.2*0.25 + ref_pressure*0.75;
    delay(20);
  }
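The reference-pressure loop is a first-order low-pass filter, ref = 0.25·sample + 0.75·ref, applied 200 times at 20 ms intervals so the zero-airspeed reading settles before flight. The recursion itself, in a testable off-target form (a sketch of the filtering step only):

```cpp
#include <cassert>
#include <cmath>

// One step of the exponential smoothing used for ref_pressure:
// 25 % of the new sample, 75 % of the running value.
double lowpass_step(double running, double sample) {
    return 0.25 * sample + 0.75 * running;
}
```

Starting from any value, 200 iterations against a constant sample converge to that sample (the old value decays as 0.75^200, about 1e-25), which is the point of the warm-up loop.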

  // SERVOS PINS //

  ailservo.attach(2);
  elevservo.attach(3);  // create servo object to control a servo
  rudservo.attach(4);
  throttle.attach(5);
  //----------------------------

  /*
  Theta = Y;
  Theta_R = 0;
  Cont_output = 90;  // EleVservo.write(90);  // neutral position of the servo

  PitchPID.SetMode(AUTOMATIC);  // turn the PID on
  PitchPID.SetOutputLimits(0, 180);
  */

}  // end of setup

void loop()
{
  //----------
  // STICK TO AUTOPILOT SWITCH //

  // Wire.endTransmission();


  // bmp.begin();
  alt = bmp.readAltitude(101325);

  if (abs(alt - alt2) > 100 || abs(alt - alt2) < 2)
  {
    alt = alt2;
  }

  read_receiver();
  Ms_Aileron = receiver_command[0];
  Ms_Elevator = receiver_command[1];
  Ms_Rudder = receiver_command[2];
  Ms_Throttle = receiver_command[3];
  Ms_switch = receiver_command[4];

  ails = Ms_Aileron;
  elevs = Ms_Elevator;
  ruds = Ms_Rudder;
  throts = Ms_Throttle;

  elevservo.write(elevs);
  ailservo.write(ails);
  rudservo.write(ruds);
  throttle.write(throts);
  if (ruds < 1348)
  {
    rudss = .0555*ruds - 73.335;
  }
  else if (ruds >= 1348)
  {
    rudss = .0333*ruds - 45.434;
  }

  elevss = .0756*elevs - 108.72;
  // elevss = .0748*elevs - 108.6;
  ailss = .0738*ails - 112.29;

  // AIRSPEED SENSOR CALIBRATION
  if ((millis() - time) >= 20)
  {
    time = millis();

    air_pressure = ((((analogRead(AIRSPEED_CH)*5.0/1024.0) - .0625*4.00)/5.0) - .5)/.2*0.25 + air_pressure*0.75;

    if (air_pressure >= ref_pressure)
    {
      pressure_diff = air_pressure - ref_pressure;
    }
    else
    {
      pressure_diff = 0.0;
    }

    airspeed = sqrt(pressure_diff*1000*airspeed_ratio);
    V = airspeed*18/5;
  }
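The airspeed line is the incompressible Bernoulli relation v = sqrt(2·Δp/ρ) with the constants folded into the 1000*airspeed_ratio factor, and V = airspeed*18/5 converts m/s to km/h. A sketch of the underlying formula only (sea-level density is an assumption here; the thesis' airspeed_ratio calibration is separate from this):

```cpp
#include <cassert>
#include <cmath>

// Indicated airspeed from differential (pitot) pressure, Bernoulli form.
// dp_pa is in pascals; returns m/s. ISA sea-level density assumed.
double airspeed_ms(double dp_pa) {
    const double rho = 1.225;  // kg/m^3
    return std::sqrt(2.0 * dp_pa / rho);
}

// Same unit conversion as V = airspeed*18/5.
double ms_to_kmh(double v) { return v * 18.0 / 5.0; }
```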

  // for (i = 1; i <= 300; i++)
  // {
  //   alpha = alpha + analogRead(1)*90/27 - alphast;
  //   beta = beta + analogRead(2)*90/27 - betast;
  //   delay(1);
  // }
  // alpha = (alpha)/300;
  // beta = (beta)/300;
  alpha = analogRead(1)*90/27 - alphast;
  beta = analogRead(2)*90/27 - betast;

  // delay(10);
  //---

  // I2C to read more than 8 variables //

  if (ADD == 1)
  {
    Wire.begin(MY_ADDRESS);
    Wire.onReceive(receiveEvent);
    ADD = 0;
  }
  else if (ADD == 0)
  {
    Wire.begin(MY_ADDRESS2);
    Wire.onReceive(receiveEvent2);
    ADD = 1;
  }

  //---

  // SERIAL MONITORING //

  Serial.print("RLL:");
  Serial.print(X);
  Serial.print(",PCH:");
  Serial.print(Y);
  Serial.print(",YAW:");
  Serial.print(Z);

  Serial.print(",altitude:");
  Serial.print(alt - H_R);

  Serial.print(",velocity:");
  Serial.print(V);

  Serial.print(",alpha:");
  Serial.print(alpha);

  Serial.print(",beta:");
  Serial.print(beta);

  Serial.print(",AN0:");
  Serial.print(xgyro);
  Serial.print(",AN1:");
  Serial.print(ygyro);
  Serial.print(",AN2:");
  Serial.print(zgyro);
  Serial.print(",AN3:");
  Serial.print(xaccel);
  Serial.print(",AN4:");
  Serial.print(yaccel);
  Serial.print(",AN5:");
  Serial.print(zaccel);

  Serial.print(",lat:");
  Serial.print(lat, 5);
  Serial.print(",lon:");
  Serial.print(lon, 5);
  delay(50);

  // Serial.print(",GALT:");
  // Serial.print(gpsalt);

  Serial.print(",gpsgc:");
  Serial.print(gpsgc);
  Serial.print(",gpsgs:");
  Serial.print(gpsgs);


  Serial.print(",fix:");
  Serial.print(fix);
  Serial.print(",numsat:");
  Serial.print(numsat);

  Serial.print(",elevs:");
  Serial.print(elevss);

  Serial.print(",ails:");
  Serial.print(ailss);
  Serial.print(",rdrs:");
  // Serial.print(ruds);
  Serial.print(rudss);

  Serial.print(",throts:");
  Serial.print(throts - 1009);

  Serial.println(",000000000000000");

  alt2 = alt;
  // delay(5);

}  // end of loop

// called by interrupt service routine when incoming data arrives

void receiveEvent(int howMany)
{
  I2C_readAnything(X);
  I2C_readAnything(Y);
  I2C_readAnything(Z);
  I2C_readAnything(xgyro);
  I2C_readAnything(ygyro);
  I2C_readAnything(zgyro);
  I2C_readAnything(xaccel);
  I2C_readAnything(yaccel);
}


void receiveEvent2(int howMany)
{
  I2C_readAnything(zaccel);
  I2C_readAnything(lat);
  I2C_readAnything(lon);
  I2C_readAnything(gpsalt);
  I2C_readAnything(gpsgc);
  I2C_readAnything(gpsgs);
  I2C_readAnything(fix);
  I2C_readAnything(numsat);
}

***************
RECEIVER.h CODE
************

#ifndef Receiver_H
#define Receiver_H

#define N_Channels 5

int receiver_raw_data(byte channel);
int receiver_command[N_Channels] = {0, 0, 0, 0, 0};
int MIN[N_Channels] = {1100, 1152, 1124, 1100, 1500};
int MAX[N_Channels] = {1956, 1904, 1904, 1960, 2000};
// int receiver_zero[N_Channels] = {1500, 1000, 1500, 1500, 1000, 1000};

void read_receiver() {
  for (byte i = 0; i < N_Channels; i++) {
    receiver_command[i] = map(receiver_raw_data(i), MIN[i], MAX[i], 1000, 2000);
  }
}

#endif
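read_receiver() relies on Arduino's integer map() to rescale each channel from its measured endpoints (MIN[i]..MAX[i]) to a uniform 1000..2000 µs range. The same linear interpolation written out off-target (this mirrors the Arduino implementation, integer division included):

```cpp
#include <cassert>

// Arduino-style map(): linear rescale with integer arithmetic.
long map_range(long x, long in_min, long in_max, long out_min, long out_max) {
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
}
```

With channel 0's limits, a raw 1100 maps to 1000, 1956 to 2000, and the midpoint 1528 to 1500.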

***********
Receiver_main.h
*************

#ifndef Receiver_Main_H
#define Receiver_Main_H

/*
MIN_ON = 1000   MAX_OFF = 20000
MAX_ON = 2000   MIN_OFF = 19000
*/

#define RISING 1
#define FALLING 0
#define MAX_ON 2030    // Max on state of the pulse width due to noise
#define MIN_ON 980     // Min on state of the pulse width due to noise
#define MAX_OFF 20050  // (20*1000) + 4000  Max off state of the pulse width due to noise
#define MIN_OFF 18000  // (19*1000) - 4000  Min off state of the pulse width due to noise

struct p {
  byte edge;
  unsigned long rise_time;
  unsigned long fall_time;
  unsigned int width;
};

volatile static byte port_history;
volatile static p pin_data[N_Channels];
// Array of structs which will get the data of the receiver pins

volatile static byte change;

static byte receiver_pin[5] = {0, 1, 2, 3, 4};
/* Used for mapping between receiver_command array & bit placement
   on PORTC register inside the interrupt routine
   PORTB PCINT 0:4 / on arduino pin function 53-52-51-50-10
   -> on arduino board */

SIGNAL(PCINT0_vect) {  // Interrupt routine
  byte current;
  byte bit;
  // byte change;  // change is the PCINT pins that've changed
  byte pin;  // pin is the pin that has changed


  // in the process
  unsigned long current_time;
  unsigned long time;  // time is the duration of the pulse-width on state

  current = PINB;  // *portInputRegister(2);
  /* In arduino we can't access PINK directly; instead we could use a
     built-in macro to access PORTK, which is the 11th port in ATMEGA 2560,
     by accessing the 11th element in the array */

  change = current ^ port_history;
  // get change between last interrupt and the current PORT status

  port_history = current;
  // store current PORT status into last interrupt for next operation

  current_time = micros();

  for (byte i = 0; i < N_Channels; i++) {
    // looping for every pin to get the pin that has changed
    bit = 0x01 << i;
    if (bit & change) {
      // check whether the current pin has changed or not
      pin = i;

      if (bit & port_history) {
        time = current_time - pin_data[pin].fall_time;
        // check if it's from LOW to HIGH
        pin_data[pin].rise_time = current_time;
        if ((time >= MIN_OFF) && (time <= MAX_OFF))
          pin_data[pin].edge = RISING;
        else
          pin_data[pin].edge = FALLING;
      }
      else {
        // Check if it's from HIGH to LOW
        time = current_time - pin_data[pin].rise_time;
        pin_data[pin].fall_time = current_time;
        if ((time >= MIN_ON) && (time <= MAX_ON) &&
            (pin_data[pin].edge == RISING)) {
          pin_data[pin].width = time;
          // Get time of "ON" part of the pulse width
          pin_data[pin].edge = FALLING;
        }
      }
    }
  }
}
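The interrupt routine finds which receiver pins toggled by XOR-ing the current port byte with the previous one: a 1 bit in current ^ port_history marks a changed pin, and that pin's current level says whether the change was a rising or falling edge. That bit logic in isolation (a sketch of the technique, not the ISR itself):

```cpp
#include <cassert>

// Given the previous and current port bytes, report whether pin i changed.
bool pin_changed(unsigned char history, unsigned char current, int i) {
    return ((history ^ current) >> i) & 0x01;
}

// If it changed, its current level tells rising (high) from falling (low).
bool pin_now_high(unsigned char current, int i) {
    return (current >> i) & 0x01;
}
```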

void initialize_receiver() {
  DDRB = 0;
  PCICR = B00000001;
  // Enable interrupt in INT0
  PCMSK0 = B00011111;
  // Enable interrupt on pins 6 to 9 (PWM) on PCINT2
  sei();
  // Enable Global Interrupt
}

int receiver_raw_data(byte channel) {
  byte pin = receiver_pin[channel];
  byte temp = SREG;
  // Save the status register before accessing the shared variables by the interrupts

  cli();
  // Disable the Global interrupt

  unsigned int raw_data = pin_data[pin].width;
  // Get pulse width of channel i attached to "pin" using shared variables

  SREG = temp;
  // restore the status register, enabling the global interrupt again
  return raw_data;
}

#endif


Appendix D: Simulink GCS

This code just separates the data, which arrives as one row, into variables. Each variable contains a vector of one signal over time.

% function [roll, pitch, yaw, height, velocity, elevpot, ailpot, rdrpot, alphapot, betapot, p, q, r, ax, ay, az, lat, lon, gpsgc, gpsgs, fix, numsat, elevservo, ailservo, rdrservo, throtservo, stop] = fcn(u)
function [roll, pitch, yaw, height, velocitya, velocityb, alphaa, alphab, betaa, betab, p, q, r, ax, ay, az, lat, lon, gpsgc, gpsgs, fix, numsat, elevservo, ailservo, rdrservo, throtservo, stop] = fcn(u)

%% initialize
u = double(u);
roll = double([0 0 0 0 0 0 0]);
pitch = double([0 0 0 0 0 0 0]);
yaw = double([0 0 0 0 0 0 0]);
height = double([0 0 0 0 0 0 0]);
velocitya = double([0 0 0 0 0 0 0]);
velocityb = double([0 0 0 0 0 0 0]);
alphaa = double([0 0 0 0 0 0 0]);
alphab = double([0 0 0 0 0 0 0]);
betaa = double([0 0 0 0 0 0 0]);
betab = double([0 0 0 0 0 0 0]);
p = double([0 0 0 0 0 0 0]);
q = double([0 0 0 0 0 0 0]);
r = double([0 0 0 0 0 0 0]);
ax = double([0 0 0 0 0 0 0]);
ay = double([0 0 0 0 0 0 0]);
az = double([0 0 0 0 0 0 0]);
lat = double([0 0 0 0 0 0 0]);
lon = double([0 0 0 0 0 0 0]);
gpsgc = double([0 0 0 0 0 0 0]);
gpsgs = double([0 0 0 0 0 0 0]);
fix = double([0 0 0 0 0 0 0]);
numsat = double([0 0 0 0 0 0 0]);
elevservo = double([0 0 0 0 0 0 0]);
ailservo = double([0 0 0 0 0 0 0]);
rdrservo = double([0 0 0 0 0 0 0]);
throtservo = double([0 0 0 0 0 0 0]);
stop = double(0);

% % g1=0; g2=0; g3=0; g4=0; g5=0; g6=0; g7=0; g8=0; g9=0;
% % g10=0; g11=0; g12=0; g13=0; g14=0; g15=0; g16=0; g17=0; g18=0;
% % g19=0; g20=0; g21=0; g22=0; g23=0; g24=0; g25=0;
%% Dividing the outputs
for nt = 1:length(u)  %1  carriage return (ASCII 13) ends the line
    if (u(nt) == 13)
        stop = nt;
        break;
        % g1 = nt-1;
    end
end
for ns = 1:length(u)-14  %2  ',throts:'
    if (ns < nt && u(ns) == 44 && u(ns+1) == 116 && u(ns+2) == 104 && u(ns+3) == 114 && u(ns+4) == 111 && u(ns+5) == 116 && u(ns+6) == 115 && u(ns+7) == 58)
        throtservo = u(ns+8:ns+14);
        % g2 = ns-1;
        break
    end
end
for nr = 1:length(u)-12  %3  ',rdrs:'
    if (nr < nt && u(nr) == 44 && u(nr+1) == 114 && u(nr+2) == 100 && u(nr+3) == 114 && u(nr+4) == 115 && u(nr+5) == 58)
        rdrservo = u(nr+6:nr+12);
        break
        % g3 = nr-1;
    end
end
for nq = 1:length(u)-12  %4  ',ails:'
    if (nq < nt && u(nq) == 44 && u(nq+1) == 97 && u(nq+2) == 105 && u(nq+3) == 108 && u(nq+4) == 115 && u(nq+5) == 58)
        ailservo = u(nq+6:nq+12);
        break
        % g4 = nq-1;
    end
end
for np = 1:length(u)-13  %5  ',elevs:'
    if (np < nt && u(np) == 44 && u(np+1) == 101 && u(np+2) == 108 && u(np+3) == 101 && u(np+4) == 118 && u(np+5) == 115 && u(np+6) == 58)
        elevservo = u(np+7:np+13);
        break
        % g5 = np-1;
    end
end
for no = 1:length(u)-14  %6  ',numsat:'
    if (no < nt && u(no) == 44 && u(no+1) == 110 && u(no+2) == 117 && u(no+3) == 109 && u(no+4) == 115 && u(no+5) == 97 && u(no+6) == 116 && u(no+7) == 58)
        numsat = u(no+8:no+14);
        break
        % g6 = no-1;
    end
end
for nn = 1:length(u)-11  %7  ',fix:'
    if (nn < nt && u(nn) == 44 && u(nn+1) == 102 && u(nn+2) == 105 && u(nn+3) == 120 && u(nn+4) == 58)
        fix = u(nn+5:nn+11);
        break
        % g7 = nn-1;
    end
end
for nm = 1:length(u)-13  %8  ',gpsgs:'
    if (nm < nt && u(nm) == 44 && u(nm+1) == 103 && u(nm+2) == 112 && u(nm+3) == 115 && u(nm+4) == 103 && u(nm+5) == 115 && u(nm+6) == 58)
        gpsgs = u(nm+7:nm+13);
        break
        % g8 = nm-1;
    end
end

for nl = 1:length(u)-13  %9  ',gpsgc:'
    if (nl < nt && u(nl) == 44 && u(nl+1) == 103 && u(nl+2) == 112 && u(nl+3) == 115 && u(nl+4) == 103 && u(nl+5) == 99 && u(nl+6) == 58)
        gpsgc = u(nl+7:nl+13);
        break
        % g9 = nl-1;
    end
end
for nk = 1:length(u)-11  %10  ',lon:'
    if (nk < nt && u(nk) == 44 && u(nk+1) == 108 && u(nk+2) == 111 && u(nk+3) == 110 && u(nk+4) == 58)
        lon = u(nk+5:nk+11);
        break
        % g10 = nk-1;
    end
end
for nj = 1:length(u)-11  %11  ',lat:'
    if (nj < nt && u(nj) == 44 && u(nj+1) == 108 && u(nj+2) == 97 && u(nj+3) == 116 && u(nj+4) == 58)
        lat = u(nj+5:nj+11);
        break
        % g11 = nj-1;
    end
end
for ni = 1:length(u)-11  %12  ',AN5:'
    if (ni < nt && u(ni) == 44 && u(ni+1) == 65 && u(ni+2) == 78 && u(ni+3) == 53 && u(ni+4) == 58)
        az = u(ni+5:ni+11);
        break
        % g12 = ni-1;
    end
end
for nh = 1:length(u)-11  %13  ',AN4:'
    if (nh < nt && u(nh) == 44 && u(nh+1) == 65 && u(nh+2) == 78 && u(nh+3) == 52 && u(nh+4) == 58)
        ay = u(nh+5:nh+11);
        break
        % g13 = nh-1;
    end
end
for ng = 1:length(u)-11  %14  ',AN3:'
    if (ng < nt && u(ng) == 44 && u(ng+1) == 65 && u(ng+2) == 78 && u(ng+3) == 51 && u(ng+4) == 58)
        ax = u(ng+5:ng+11);
        break
        % g14 = ng-1;
    end
end

for nf = 1:length(u)-11  %15  ',AN2:'
    if (nf < nt && u(nf) == 44 && u(nf+1) == 65 && u(nf+2) == 78 && u(nf+3) == 50 && u(nf+4) == 58)
        r = u(nf+5:nf+11);
        break
        % g15 = nf-1;
    end
end
for ne = 1:length(u)-11  %16  ',AN1:'
    if (ne < nt && u(ne) == 44 && u(ne+1) == 65 && u(ne+2) == 78 && u(ne+3) == 49 && u(ne+4) == 58)
        q = u(ne+5:ne+11);
        break
        % g16 = ne-1;
    end
end
for nd = 1:length(u)-12  %17  ',AN0:'
    if (nd < nt && u(nd) == 44 && u(nd+1) == 65 && u(nd+2) == 78 && u(nd+3) == 48 && u(nd+4) == 58)
        p = u(nd+5:nd+11);
        break
        % g17 = nd-1;
    end
end
for nc = 1:length(u)-12  %18  ',betaa:'
    if (nc < nt && u(nc) == 44 && u(nc+1) == 98 && u(nc+2) == 101 && u(nc+3) == 116 && u(nc+4) == 97 && u(nc+5) == 97 && u(nc+6) == 58)
        betaa = u(nc+7:nc+13);
        break
        % g18 = nc-1;
    end
end
for nc = 1:length(u)-12  %18  ',betab:'
    if (nc < nt && u(nc) == 44 && u(nc+1) == 98 && u(nc+2) == 101 && u(nc+3) == 116 && u(nc+4) == 97 && u(nc+5) == 98 && u(nc+6) == 58)
        betab = u(nc+7:nc+13);
        break
        % g18 = nc-1;
    end
end
for nb = 1:length(u)-13  %19  ',alphaa:'

    if (nb < nt && u(nb) == 44 && u(nb+1) == 97 && u(nb+2) == 108 && u(nb+3) == 112 && u(nb+4) == 104 && u(nb+5) == 97 && u(nb+6) == 97 && u(nb+7) == 58)
        alphaa = u(nb+8:nb+14);
        break
        % g19 = nb-1;
    end
end
for nb = 1:length(u)-13  %19  ',alphab:'
    if (nb < nt && u(nb) == 44 && u(nb+1) == 97 && u(nb+2) == 108 && u(nb+3) == 112 && u(nb+4) == 104 && u(nb+5) == 97 && u(nb+6) == 98 && u(nb+7) == 58)
        alphab = u(nb+8:nb+14);
        break
        % g19 = nb-1;
    end
end
% for na = 1:length(u)-11  %20  ',rdr:'
%     if (na < nt && u(na) == 44 && u(na+1) == 114 && u(na+2) == 100 && u(na+3) == 114 && u(na+4) == 58)
%         rdrpot = u(na+5:na+11);
%         break
%         % g20 = na-1;
%     end
% end
% for oi = 1:length(u)-11  %21  ',ail:'
%     if (oi < nt && u(oi) == 44 && u(oi+1) == 97 && u(oi+2) == 105 && u(oi+3) == 108 && u(oi+4) == 58)
%         ailpot = u(oi+5:oi+11);
%         break
%         % g21 = oi-1;
%     end
% end
% for nnu = 1:length(u)-12  %22  ',elev:'
%     if (nnu < nt && u(nnu) == 44 && u(nnu+1) == 101 && u(nnu+2) == 108 && u(nnu+3) == 101 && u(nnu+4) == 118 && u(nnu+5) == 58)
%         elevpot = u(nnu+6:nnu+12);
%         break
%         % g22 = nnu-1;
%     end
% end
for G = 1:length(u)-16  %23  ',velocitya:'
    if (G < nt && u(G) == 44 && u(G+1) == 118 && u(G+2) == 101 && u(G+3) == 108 && u(G+4) == 111 && u(G+5) == 99 && u(G+6) == 105 && u(G+7) == 116 && u(G+8) == 121 && u(G+9) == 97 && u(G+10) == 58)
        velocitya = u(G+11:G+17);
        break
        % g23 = G-1;
    end
end
for GF = 1:length(u)-16  %23  ',velocityb:'
    if (GF < nt && u(GF) == 44 && u(GF+1) == 118 && u(GF+2) == 101 && u(GF+3) == 108 && u(GF+4) == 111 && u(GF+5) == 99 && u(GF+6) == 105 && u(GF+7) == 116 && u(GF+8) == 121 && u(GF+9) == 98 && u(GF+10) == 58)
        velocityb = u(GF+11:GF+17);
        break
        % g23 = G-1;
    end
end
for K = 1:length(u)-16  %24  ',altitude:'
    if (K < nt && u(K) == 44 && u(K+1) == 97 && u(K+2) == 108 && u(K+3) == 116 && u(K+4) == 105 && u(K+5) == 116 && u(K+6) == 117 && u(K+7) == 100 && u(K+8) == 101 && u(K+9) == 58)
        height = u(K+10:K+16);
        break
        % g24 = K-1;
    end
end
for J = 1:length(u)-16  %25  ',YAW:'
    if (J < nt && u(J) == 44 && u(J+1) == 89 && u(J+2) == 65 && u(J+3) == 87 && u(J+4) == 58)
        yaw = u(J+5:J+11);
        break
        % g25 = J-1;
    end
end
for l = 1:length(u)-16  %26  ',PCH:'
    if (l < nt && u(l) == 44 && u(l+1) == 80 && u(l+2) == 67 && u(l+3) == 72 && u(l+4) == 58)
        pitch = u(l+5:l+11);
        break
    end
end
roll = u(1:7);
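Each loop above scans for a comma-delimited tag by comparing raw ASCII codes one byte at a time (44 = ',', 58 = ':'); for instance the sequence 44, 116, 104, 114, 111, 116, 115, 58 is ',throts:'. The same search in a testable form (find_field is a hypothetical helper for illustration, not the thesis code):

```cpp
#include <cassert>
#include <cstring>

// Find the index just past ",tag:" in a byte stream, or -1 if absent.
// Mirrors the MATLAB loops that compare ASCII codes one by one.
int find_field(const unsigned char *u, int n, const char *tag) {
    int m = static_cast<int>(std::strlen(tag));
    for (int i = 0; i + m <= n; ++i) {
        if (std::memcmp(u + i, tag, m) == 0) return i + m;  // data starts here
    }
    return -1;
}
```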


*******

*******

The following code transfers each variable from ASCII into decimal.
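The same conversion can be sketched compactly (an illustration, not the thesis code): walk the bytes, accumulate the integer part, then the fractional part scaled by powers of ten, and negate when the first byte is '-' (ASCII 45), with '.' being ASCII 46 and the digits 48..57.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Convert a fixed-width ASCII field (e.g. "-12.50") to a double.
double ascii_to_decimal(const unsigned char *u, std::size_t n) {
    double value = 0.0;
    double frac_scale = 0.1;
    bool negative = (n > 0 && u[0] == 45);   // leading '-'
    bool after_point = false;
    for (std::size_t i = (negative ? 1 : 0); i < n; ++i) {
        if (u[i] == 46) { after_point = true; continue; }  // decimal point
        double digit = u[i] - 48;                          // '0' -> 0
        if (!after_point) {
            value = value * 10.0 + digit;                  // integer part
        } else {
            value += digit * frac_scale;                   // fractional part
            frac_scale *= 0.1;
        }
    }
    return negative ? -value : value;
}
```

The four cases in the MATLAB below (negative/positive, with/without a decimal point) all reduce to this one accumulation.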

function SIGNAL = fcn(u)
SIGNAL = double(500);
t = length(u(1,1,:));

left = double(0);
x = [0];
%% Remove all data after 44.
N = u;
for i = 1:length(u)
    if u(i) == 44
        N = u(1:i-1);
        break
    end
end
u = N;
%% Case I: -ve with sign
if u(1,1,t) == 45  % if -ve
    for i = 1:length(u)
        if u(i) == 46
            x(1) = i;
            break
        end
    end
    if x(1) ~= 0
        % if length(u(1,:,t)) > x(1)+3
        %     for j = x(1)+1:1:x(1)+3
        %         n = 1+n;
        %         r = r + (u(1,j,t)-48)*(0.1)^n;
        %     end
        % else
        n = 0;
        r = double(0);
        for j = x(1)+1:1:length(u(1,:,t))
            n = 1+n;
            r = double(r + (u(1,j,t)-48)*((0.1)^n));
        end

        n = 0;
        for j = 2:x(1)-1
            left = double(left + (u(1,j,t)-48)*(10)^(x(1)-3-n));
            n = n+1;
        end
        SIGNAL = -1*(r+left);
        SIGNAL = double(SIGNAL);
    end
    %% Case II: -ve and no decimal
    if x(1) == 0  % no decimal but still negative
        n = 0;
        SIGNAL = double(0);
        for j = length(u(1,:,t)):-1:2
            SIGNAL = double(SIGNAL + (u(1,j,t)-48)*(10)^n);
            n = n+1;
        end
        SIGNAL = double(-SIGNAL);
    end
%% Case III: positive with decimal
elseif u(1,1,t) > 45  %% if was +ve
    for i = 1:length(u)
        if u(i) == 46
            x(1) = i;
            break;
        else
            x(1) = 0;
        end
    end
    if x(1) ~= 0
        n = 0;
        r = double(0);
        for j = x(1)+1:length(u(1,:,t))
            n = 1+n;
            r = double(r + (u(1,j,t)-48)*(.1)^n);
        end

        n = 0;
        left = double(0);
        for j = 1:x(1)-1
            left = double(left + (u(1,j,t)-48)*(10)^(x(1)-2-n));
            n = n+1;
        end
        SIGNAL = (r+left);
        SIGNAL = double(SIGNAL);
    %% Case IV: positive with no decimal
    else  % no decimal but still positive
        n = 0;
        SIGNAL = double(0);
        for j = length(u(1,:,t)):-1:1
            SIGNAL = double(SIGNAL + (u(1,j,t)-48)*(10)^n);
            n = n+1;
        end
    end
else
    SIGNAL = double(500);
end
end
