
EXECUTIVE SUMMARY

INTRODUCTION/BACKGROUND

Ethology is the scientific and objective study of animal behaviour [1]. The behaviour of an animal can be studied through observation in a controlled environment, which makes it easier to determine how the behaviour is influenced by stimuli from the environment. Recent developments in science automation indicate that machines can replace humans in autonomously conducting scientific experiments [3]. Inspired by this idea, a method is proposed for this project in which a computer changes certain conditions in the environment while simultaneously observing the reaction of the animal. Through certain system identification algorithms, the behaviour of the animal can be learned; no prior knowledge about the animal is required. In this project, a programmed e-puck robot is used to simulate an animal, so that the behaviour of the robot is known in advance and its response to stimuli in the environment is therefore known. The computer used in this project observes and learns the behaviour of the robot. In this way, comparisons can be made and the learning of the computer can be verified.

AIMS AND OBJECTIVES

The project aims to design, implement, and study an experimental platform that enables system identification algorithms to learn about the behaviour of a physical agent (e.g., animals, robots) through controlled interaction.

The objectives of the project are:

a. To conduct a literature review (e.g., science automation, evolutionary robotics, motion tracking)
b. To become familiar with the e-puck robotic system
c. To design and make a computer-controllable light source
d. To verify the light source experimentally
e. To programme the robot so that it reacts to the light source in a way similar to an animal
f. To implement a vision-based motion tracking system
g. To verify the motion tracking system experimentally
h. To implement a system identification method that allows a machine to learn about the robot through controlled interaction
i. To conduct experiments to verify the system identification method
j. To analyse the results and make improvements

ACHIEVEMENTS

First of all, a computer-controllable light source was designed and implemented. Next, the e-puck robot system was studied, and an e-puck robot was programmed to perform a circular motion whose speed varies with the light intensity in the environment. In addition, a vision-based motion tracking programme was modified to track the motion of the e-puck robot and estimate its speed, and a series of tests were conducted to verify the motion tracking system. Finally, a basic system identification algorithm was applied so that the computer could learn the motion of the e-puck robot, and the result of the system identification was verified using MATLAB. Overall, the platform that enables system identification algorithms to learn about the behaviour of a physical agent through controlled interaction has been successfully built and utilized.

CONCLUSIONS / RECOMMENDATIONS

The aims and objectives of this project have been almost completely achieved. Improvements can be made by increasing the accuracy of the motion tracking system, adjusting the design of the arena so that the light intensity is more evenly distributed, and verifying the system identification algorithm experimentally.


ABSTRACT

This project is based on the idea of science automation, with the purpose of designing, implementing, and studying an experimental platform that enables system identification algorithms to learn about the behaviour of a physical agent, such as a robot or an animal. In this project, a programmed e-puck robot simulates an animal reacting to stimuli in the environment. A computer-controllable light source was designed and implemented to provide such a stimulus. Moreover, a computer was used to control the light intensity of the environment and also to track the motion of the e-puck. Finally, a simple system identification algorithm was carried out to learn the behaviour of the robot: the relationship between the speed of the e-puck robot and the light intensity was estimated, and the result of the system identification was verified using MATLAB. Due to the limited time for this project, the current platform is not able to learn the behaviour of a real animal. However, as the platform is gradually improved in the future, it may become applicable to real animals and offer convenience to ethologists.


ACKNOWLEDGEMENTS

I would like to thank my parents for their support and encouragement throughout my study.

I would like to express my great appreciation to Dr Gross for his valuable and constructive suggestions for my project.

I would also like to thank Mr Eastwood for his professional guidance, valuable support, and willingness to give his time generously while implementing the controllable light source.

I am particularly grateful for the assistance provided by the PhD students Wei Li and Jianing Chen.

Finally, I wish to thank all my friends studying with me in the Information Commons. Special thanks should be given to Mr Hao Zhu for his willingness to be my model for explaining the function of the motion tracking programme.


TABLE OF CONTENTS

Chapter 1 – Introduction
    1.1 Background and Motivation
    1.2 Problem Definition
    1.3 Aims and Objectives
    1.4 Dissertation Overview
    1.5 Project Management
Chapter 2 – Literature Review
    2.1 The Automation of Science
    2.2 System Identification
    2.3 Robotics
Chapter 3 – Background Theory
    3.1 Differential Drive Kinematics
    3.2 Pulse Width Modulation (PWM) Control
    3.3 Motion Tracking
        3.3.1 Colour Model
        3.3.2 Tracking of Coloured Object
    3.4 Arduino Board Specifications
    3.5 E-puck Robot Specifications
        3.5.1 Technical Specifications
        3.5.2 IR Proximity Sensors
    3.6 Field Effect Transistor
Chapter 4 – Design and Implementation
    4.1 Controllable Light Source
        4.1.1 Design of the Light Source
        4.1.2 Programming for Controllable Light Source
        4.1.3 Implementation of the Light Source
    4.2 E-puck Robot for Simulation of an Animal
        4.2.1 Design of the Motion of the E-puck
        4.2.2 Test of the IR Sensors of the E-puck
        4.2.3 Test of the Motors of the E-puck
        4.2.4 Programming of the E-puck
    4.3 Arena
        4.3.1 Design of the Arena
        4.3.2 Construction of the Arena
        4.3.3 Test of the Distribution of Light in the Arena
        4.3.4 Determining of the Coefficients in the Programming of the E-puck
    4.4 Motion Tracking
        4.4.1 Adjustment of the Motion Tracking Programme
        4.4.2 Test of the Coordinate System
        4.4.3 Test of Motion Tracking
    4.5 System Identification
        4.5.1 Results from the Experiment
        4.5.2 One Basic System Identification Algorithm
        4.5.3 Analysis of the System Identification
        4.5.4 Other Attempt
Chapter 5 – Conclusions and Future Work
    5.1 Evaluation of the Current Work
    5.2 Future Work
References
Appendices
    A.1 Programme of the E-puck Robot
    A.2 Programme for System Identification
    A.3 Programme for Analysis of the Result of System Identification


List of Figures

Figure 1 Gantt Chart for the Project.
Figure 2 Differential Drive Kinematics, Reprinted from [19].
Figure 3 Square Waves with 50 % and 25 % Duty Cycles, Adapted from [20].
Figure 4 Basic Additive Principles of the RGB Colours, Retrieved from [22].
Figure 5 RGB Colour Space, Retrieved from [23].
Figure 6 HSL Colour Space, Retrieved from [25].
Figure 7 Original Image of a Red Object.
Figure 8 Red Colour in the Original Image is Chosen.
Figure 9 Grayscale Image Converted from the Original Image.
Figure 10 Result of Subtracting.
Figure 11 Result of Median Filtering.
Figure 12 Binary Image.
Figure 13 Binary Image after Removing Small Objects.
Figure 14 Final Result of Coloured Object Tracking.
Figure 15 Front of Arduino UNO Board, Reprinted from [28].
Figure 16 PWM Control via Arduino Board, Adapted from [30].
Figure 17 Appearance of the E-puck, with Labels of Components, Retrieved from [31].
Figure 18 Positions of the IR Proximity Sensors from Top View, Reprinted from [35].
Figure 19 Example of Raw Data Acquired from IR0, Reprinted from [35].
Figure 20 Basic Structure of an N-channel MOSFET, Reprinted from [36].
Figure 21 Sketch of the Circuit of the Controllable Light Source.
Figure 22 Result of the Test of the Arduino Board.
Figure 23 Front of the Main Circuit of the Controllable Light Source.
Figure 24 Back of the Main Circuit of the Controllable Light Source.
Figure 25 Interface of Tiny Bootloader.
Figure 26 Corner of the Arena.
Figure 27 The Entire Arena.
Figure 28 Positions in the Arena for Measuring the Light Intensity.
Figure 29 Image of Distribution of the Light Intensity.
Figure 30 Relationships between the Actual and Coordinate Distances.
Figure 31 Comparison between the Theoretical Values and Those Estimated from the Programme.
Figure 32 Estimation of Speed with Sampling Periods of 0.2, 0.3, and 0.5 sec.
Figure 33 Estimation of Speed with Sampling Periods of 0.7, 1, 1.5 and 2 sec.
Figure 34 Results from the Experiment.
Figure 35 Relationship between y and x When Degree = 1.
Figure 36 Relationship between y and x When Degree = 2.
Figure 37 Comparison between Experimental Data and Result from System Identification.
Figure 38 Raw Data from Experiment.
Figure 39 Transfer Function Estimated from the Toolbox.
Figure 40 Model Output Compared with Real Output.


List of Tables

Table 1 Time Management for Tasks in the Project.
Table 2 Specifications of Arduino UNO Board [28].
Table 3 Technical Specifications of the E-puck [32].
Table 4 Advantages and Disadvantages of Field-effect Transistors Compared with Conventional Transistors [36].
Table 5 Values of the Sum of the Light Intensity.
Table 6 Data from the Test of the Coordinate System.
Table 7 Theoretical Speed Corresponding to the Speed Defined.
Table 8 Values of Speed Estimated by the Motion Tracking System.


Chapter 1 – Introduction

1.1 Background and Motivation

Ethology, the scientific and objective study of animal behaviour, has developed over centuries [1]. Normally, the behaviour of an animal is studied through observations in a specific environment, which can be either uncontrolled or controlled. If the environment is uncontrolled, the animal may react to many different sorts of information from the environment at the same time, and it can then be difficult to determine how the animal's behaviour is influenced by individual stimuli. On the contrary, if the environment is controlled, it may be much easier to analyse the animal's behaviour; however, designing such a controllable environment is itself a problem [2]. This project is concerned with the study of the behaviour of one animal in a controlled environment.

An example in science automation indicates that machines can replace humans in autonomously conducting scientific experiments [3]. Inspired by this idea, a method is proposed for this project in which a computer automatically controls the environment, changing certain conditions in order to stimulate the animal. In addition, the behaviour of the animal, that is, its reaction to the stimuli, can be observed by the computer. Through certain system identification algorithms, the behaviour of the animal can be learned. In this respect, no prior knowledge about the animal is required. However, considering the limited time available, this project focuses on the motion of the animal, and the light intensity is the only factor considered as a changing variable of the environment.

Recent developments in robotics show that robots are widely used in scientific research [4]. In this project, a robot plays the role of an animal. A programmed e-puck robot is used, so that the behaviour of the robot is known in advance; hence, the response of the robot to stimuli in the environment is known. The computer used in this project is capable of observing and learning the behaviour of the robot. In this way, comparisons can be made and the learning of the computer can be verified. When the whole system is gradually improved in the future, it can be used for real animal behaviour analysis.

1.2 Problem Definition

Based on the theory and method stated above, the whole project can be divided into four parts: the controllable light source, the programming of the e-puck robot, motion tracking, and system identification. The specifications of each part are:

a. The controllable light source is a simple embedded system connected to a computer. Programming in the C language is required for such an embedded system. The light source reads the information sent from the computer and changes the light intensity accordingly.
b. For the programming of the e-puck robot, the robot should be able to react to the changing light intensity. In this project, to simplify the problem, the speed of the robot is designed to increase as the light intensity increases.
c. A camera is fixed above the environment to observe the motion of the e-puck robot. An image processing algorithm implemented in MATLAB is used to track a piece of coloured paper stuck to the robot.
d. Basic system identification algorithms are applied to analyse how the speed of the e-puck robot changes with the light intensity. Comparisons can then be made between the results of the system identification and the original programming of the robot.

1.3 Aims and Objectives

The project aims to design, implement, and study an experimental platform that enables system identification algorithms to learn about the behaviour of a physical agent (e.g., animals, robots) through controlled interaction.

The objectives of the project are:

a. To conduct a literature review (e.g., science automation, evolutionary robotics, motion tracking)
b. To become familiar with the e-puck robotic system
c. To design and make a computer-controllable light source
d. To verify the light source experimentally
e. To programme the robot so that it reacts to the light source in a way similar to an animal
f. To implement a vision-based motion tracking system
g. To verify the motion tracking system experimentally
h. To implement a system identification method that allows a machine to learn about the robot through controlled interaction
i. To conduct experiments to verify the system identification method
j. To analyse the results and make improvements

1.4 Dissertation Overview

This dissertation is organised as follows:

Chapter 2: Literature Review, which presents recent developments in science automation, robotics, and system identification, and explains the differences between this project and previous applications.

Chapter 3: Background Theory, which explains the basic concepts of differential drive kinematics, pulse width modulation (PWM) control, motion tracking, etc.

Chapter 4: Design and Implementation, which elaborates the detailed design and implementation of each part of the project. This chapter also describes all the experiments and presents and analyses their results.

Chapter 5: Conclusions and Future Work, which discusses whether the aims and objectives of the project have been achieved. In addition, this chapter discusses the achievements and drawbacks of the current progress and the plan for future work.


1.5 Project Management

The project is divided into a series of tasks. The initial time management for each task is shown in Table 1.

Table 1 Time Management for Tasks in the Project.

Task Name | Duration | Start | Finish
1 Determine the Aim and Objectives of the Project | 12 days | Mon 13/05/13 | Tue 28/05/13
2 Research Literature Materials | 33 days | Wed 29/05/13 | Fri 12/07/13
3 Learn and Become Familiar with E-puck Robotics | 3 days | Wed 29/05/13 | Fri 31/05/13
4 Design a Computer Controllable Light Source | 5 days | Mon 03/06/13 | Fri 07/06/13
5 Purchase Equipment | 10 days | Mon 10/06/13 | Fri 21/06/13
6 Programme the Robot | 3 days | Mon 10/06/13 | Wed 12/06/13
7 Adjust the Programming of the Robot | 2 days | Thu 13/06/13 | Fri 14/06/13
8 Learn Motion Tracking System | 5 days | Mon 17/06/13 | Fri 21/06/13
9 Build the Computer Controllable Light Source | 3 days | Mon 24/06/13 | Wed 26/06/13
10 Verify Experimentally the Light Source | 2 days | Thu 27/06/13 | Fri 28/06/13
11 Implement a Vision-Based Motion Tracking System | 3 days | Mon 01/07/13 | Wed 03/07/13
12 Verify Experimentally the Motion Tracking System | 2 days | Thu 04/07/13 | Fri 05/07/13
13 Implement a System Identification Method | 5 days | Mon 08/07/13 | Fri 12/07/13
14 Conduct Experiments to Verify the System Identification Method | 5 days | Mon 15/07/13 | Fri 19/07/13
15 Analyze the Results and Make Adjustments | 5 days | Mon 22/07/13 | Fri 26/07/13
16 Write the Dissertation | 25 days | Mon 29/07/13 | Fri 30/08/13

The Gantt chart for this project is displayed in Figure 1.


Figure 1 Gantt Chart for the Project.

During the actual execution of this plan, one extra week was spent on building and verifying the controllable light source. However, the progress of the subsequent tasks was not affected.


Chapter 2 – Literature Review

2.1 The Automation of Science

In modern society, science automation plays a major role in a significant number of fields, such as biology, chemistry, and pharmaceutics.

Ross D. King and his colleagues introduced the basic concepts of the automation of science in The Automation of Science. In that paper, King et al. consider that the foundations of science are the hypothetico-deductive approach and the detailed recording of experiments to enable reproducibility [5]. With the development of computers, their use for controlling, analysing, and modelling has become part of science. King et al. give a clear explanation of the automation of science, as they demonstrate:

    This is a physically implemented laboratory automation system that exploits techniques from the field of artificial intelligence to execute cycles of scientific experiments. A robot scientist automatically originates hypotheses to explain observations, devises experiments to test these hypotheses, physically runs the experiments by using laboratory robotics, interprets the results, and then repeats the cycle [5].

In addition, King et al. illustrate the development of the Robot Scientist Adam in [5]. Adam is an application of science automation in the field of biology. Because of the massive amount of experimental data in biological systems, problems often occur when acquiring the data, and compared with traditional manual methods the robot scientist methodology can be more efficient. Adam is able to autonomously create functional genomics hypotheses about the yeast Saccharomyces cerevisiae and then conduct experiments to verify the hypotheses [5]. The conclusions of Adam have been confirmed by King et al. through manual experiments.

High-throughput screening (HTS) is another application of the automation of science. Persidis elaborates on HTS in terms of its historical perspective, current state, challenges, and future development in [6]. In the drug industry today, analysing a large number of potential drug candidates through manual labour is simply impossible because of the considerable time, money, and manpower required. Hence, the automation of science is essential if drug candidates are to be identified within a reasonable timeframe and at acceptable cost. HTS can automatically analyse vast quantities of potential drug candidates in a short time, which frees researchers from repetitive experiments. Persidis states that the original screening was applied in blood assays for blood components with limited efficiency in the 1970s; however, nearly half a century later, a significant number of HTS approaches based on entire cell-line-based systems have been developed, such as the receptor selection and amplification technology (R-SAT) developed by Acadia Pharmaceuticals (San Diego, CA) [6]. Moreover, Persidis points out an often-neglected problem: the supply of key biology reagents of adequate quantity and qualified quality for conducting assays is limited [6]. Furthermore, Persidis believes that the applications of HTS approaches will continue to advance rapidly.

In [7], science automation is involved in the field of system identification. Macleod and his colleagues report that an imaging system used by the US government was designed to identify marine zooplankton for the detection of the Deepwater Horizon oil spill [7]. In addition, the Digital Automated Identification System (DAISY) was developed to automatically identify biological species with high accuracy and speed. Moreover, Macleod et al. [7] state that human error during identification is unavoidable; sometimes people even make choices that differ from their own previous selections. Thus, automated identification technologies can be more objective and accurate.

To summarise these applications of science automation, computers or robots are involved to varying extents in different fields of science and engineering. Similar to [7], system identification algorithms are also applied in this project; nonetheless, slight differences exist between [7] and this project. Here, a computer is able to play the role of a human researcher by automatically changing the light intensity of the environment and tracking the motion of the robot in real time. Besides, because the simulated motion of the robot is simple, no prior knowledge is needed for the computer to learn the behaviour of the robot. Hence, this project is a simple but new trial of the automation of science.

2.2 System Identification

System identification is now commonly used in both science and engineering. Pintelon and Schoukens explain the basic concepts of system identification in [8]. In short, system identification usually focuses on certain aspects of nature at a time; the remaining aspects can be treated as the environment of the system. During experiments, the interactions between the system and the environment are often described by a set of input and output values. After a series of experiments, the salient features of the system can be obtained as the result of the system identification process.
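As a concrete illustration of this input-output view, and of the kind of static relationship estimated later in this project (robot speed as a function of light intensity), the following is a minimal MATLAB sketch of a least-squares polynomial fit. The data values and the first-order model are illustrative assumptions, not measurements from the project.

```matlab
% Illustrative input-output data: input u (e.g. a light level) and
% measured output y (e.g. a robot speed); the values are made up for this sketch.
u = [0 50 100 150 200 250];        % input applied during the experiments
y = [2.1 3.0 4.2 5.1 6.3 7.0];     % output observed for each input

% Fit a low-order polynomial model y = p(u) by least squares.
p = polyfit(u, y, 1);              % first-order (straight-line) model

% Evaluate the identified model and compare it with the data.
yhat     = polyval(p, u);
residual = y - yhat;
fprintf('Identified model: y = %.4f*u + %.4f\n', p(1), p(2));
```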

Gevers expresses his personal point of view on the development of system identification in [9]. A considerable amount of early work on identification was performed by statisticians and econometricians. The statistical theory of parameter estimation was based on the work of Gauss (1809) and Fisher (1912), and it developed significantly between 1920 and 1970 [9]. Gevers [9] regards the publication of the seminal papers [10] and [11] in 1965 as the birth of identification theory. These two papers set the stage for the subsequent progress of today's two mainstream identification techniques, subspace identification and prediction-error identification. Furthermore, Gevers [9] introduces some other milestone works as well, such as deterministic realization theory and the maximum likelihood framework. The latter is utilised in this project and will be discussed further in the next chapter.

In biology, coevolution means that changes in two biological objects influence each other [12]. In system identification, a coevolutionary system often consists of two or more populations, which can be evolved simultaneously [13]. In recent decades, system identification through coevolutionary approaches has been widely applied. One example is the nonlinear system identification approach called the estimation-exploration algorithm presented by Bongard and Lipson [13], which comprises two populations: the estimation population and the exploration population. The estimation population evolves improved models, and the exploration population evolves informative tests using the best models so far. As a result, the number of tests can be minimised. Another application is the coevolutionary algorithm presented by Mirmomeni and Punch for the prediction of chaotic dynamics [14]. This algorithm consists of two populations, the model population and the evaluation population, which are co-evolved to obtain an optimal model. This approach can be used for time-varying problems.

The total time for this project is about three months, which may not be sufficient to implement such a coevolutionary algorithm. Nonetheless, if time permits, a classifier will be designed and implemented. In that case, the system would comprise a model structure and a classifier structure. The model is designed to act like a "real" animal, while the classifier interacts with the environment by changing its conditions and judges whether the agent it observes is a "real" animal. If the agent observed is judged to be a model, the model structure will be adjusted. If the classifier repeatedly makes wrong judgements, its fitness, which depends only on its ability to distinguish the behaviour of the models from the behaviour of the "real" animal, should be adjusted. In this way, the two structures can be co-evolved.

2.3 Robotics

According to ISO 8373, a robot is an actuated mechanism that can be programmed in two or more axes with a degree of autonomy, moving within its environment to perform specific tasks [15]. Wang et al. discuss the autonomy of a robot in [16]. They claim that an autonomous robot should have certain characteristics. Firstly, it should be capable of sensing changes in the environment and reacting appropriately. Multi-sensor data fusion is an effective approach for dealing with sensor information; other techniques, such as the least squares method and Bayesian methods, can also be used and have different advantages. Secondly, the robot should have the capability of self-organization in such a dynamic environment [16].


Moreover, robots can be broadly divided into two types: industrial robots and service robots. Lecture notes from Dr Tokhi describe the characteristics of these two types of robots as follows:

    The industrial robot is automatically controlled, reprogrammable, multi-purpose manipulator, programmable in three or more axes, which can be either fixed in place or mobile for use in industrial automation applications. The service robot is a robot that performs useful tasks for humans or equipment excluding industrial automation applications [15].

There are also other classifications, especially for mobile robots, for example by mode of locomotion, or by whether the robot is wheeled or legged. The various types of robots are now widely applied in the fields of industry, medicine, healthcare, space, mining, education, etc. [15].

The e-puck robot used in this project belongs to the category of service robots. The e-puck is a small mobile robot designed by École Polytechnique Fédérale de Lausanne (EPFL; English: Swiss Federal Institute of Technology in Lausanne) in Switzerland [4], and it has become one of the most widely used educational robots in the world [4]. The e-puck robot has a considerable number of advantages [17]. Firstly, the robot is small, so it is convenient to carry. Secondly, its components are produced using mass-production manufacturing technology, so the robot is not expensive: it costs 850 CHF (about 587 GBP) [18]. Thirdly, the mechanical structure of the robot is simple but robust, so users can easily understand the robot and make use of it. Fourthly, the e-puck robot offers good flexibility thanks to its sensors: there are three sound sensors, eight IR proximity sensors, a camera, and a 3D accelerometer on the e-puck. Last but not least, the robot is user friendly because it is convenient to programme it from a computer. The specifications of the motors and IR proximity sensors of the e-puck robot are discussed further in the next chapter.

Because the e-puck robot has so many advantages, it is highly regarded in the field of education. Mondada et al. [4] state that the e-puck was originally used in an embedded programming course in 2005. The course aimed to let the students understand and practise embedded programming using the C/C++ language. Students attending the course were asked to evaluate whether the e-puck robot illustrated the concepts of the course well. According to the survey, most of the students considered that the e-puck robot performed well for illustrating the concepts of the course, and the evaluations improved over the three years of e-puck use from 2005. In 2007 in particular, the e-puck robot was considered an excellent teaching tool by more than 90 per cent of the students.

Furthermore, Wang et al. present the concepts of evolutionary robotics in [16]. Early robots were not able to sense changes in the external environment, so when something interfered with the pre-specified task, the robot might not perform accurately. In contrast, evolutionary robotics now uses artificial evolution techniques to improve performance in an unpredictable external environment, together with methods such as multi-sensor data fusion, the least squares method, Bayesian methods, and fuzzy logic. Wang et al. also present the main strategies for robotic controller design and various applications of artificial evolution in robotics in [16]. In this project, the e-puck robot works in an environment with changing light intensity, and the data obtained from the IR proximity sensors are always affected by noise. Hence, one problem is how to reduce the influence of noise on the sensors of the e-puck robot. A multi-sensor data fusion algorithm could be used to solve this problem if time is sufficient.


Chapter 3 – Background Theory

3.1 Differential Drive Kinematics

A large number of mobile robots use differential drive as their drive mechanism, which consists of two drive wheels mounted on a common axis. Each wheel can be independently driven either forward or backward. The velocity of each wheel can be varied so that the robot performs rolling motion. There is a point along the common left and right wheel axis about which the robot rotates, namely the ICC (Instantaneous Centre of Curvature), as shown in Figure 2 [19].

Figure 2 Differential Drive Kinematics, Reprinted from [19].

The trajectories of the robot can be changed by adjusting the velocities of the two wheels. Because the angular velocity $\omega$ about the ICC must be the same for both wheels, the following equations can be written:

$\omega (R + l/2) = V_r$    (Equation 1)

$\omega (R - l/2) = V_l$    (Equation 2)


where $l$ is the distance between the two wheels, $V_r$ and $V_l$ are the velocities of the right and left wheels along the ground, and $R$ is the distance from the ICC to the midpoint between the wheels.

These equations can be rearranged to solve for $R$ and $\omega$ when the other parameters are known:

$R = \dfrac{l}{2} \cdot \dfrac{V_l + V_r}{V_r - V_l}$    (Equation 3)

$\omega = \dfrac{V_r - V_l}{l}$    (Equation 4)
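A minimal MATLAB sketch of Equations 3 and 4 is given below; the wheel speeds are illustrative assumptions, and the wheel separation is the approximately 53 mm quoted for the e-puck in Table 3.

```matlab
% Minimal sketch of Equations 3 and 4 (the wheel speeds are assumed values;
% the wheel separation is the e-puck figure of about 53 mm from Table 3).
Vl = 0.10;     % left wheel speed along the ground (m/s)
Vr = 0.12;     % right wheel speed along the ground (m/s)
l  = 0.053;    % distance between the two wheels (m)

R     = (l/2) * (Vl + Vr) / (Vr - Vl);   % distance from ICC to wheel midpoint (Equation 3)
omega = (Vr - Vl) / l;                   % angular velocity about the ICC (Equation 4)

% Special cases: Vr == Vl gives straight-line motion (R tends to infinity),
% and Vr == -Vl gives rotation in place about the wheel midpoint (R = 0).
```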

3.2 Pulse Width Modulation (PWM) Control

Pulse Width Modulation, or PWM, is a technique which uses a digital output from a microcontroller to control analogue circuits [20]. The microcontroller creates a square wave, i.e., a signal switched between on and off. The proportion of the output signal which is high is called the duty cycle [20]. Examples of square waves with 50 % and 25 % duty cycles are illustrated in Figure 3. Briefly, the average voltage (and current) fed to the load is controlled by changing the duty cycle; a higher duty cycle of the control signal leads to more power being supplied to the load.

Figure 3 Square Waves with 50 % and 25 % Duty Cycles, Adapted from [20].

In addition, the modulation frequency is the frequency of one cycle, which depends on the response time of the device. The modulation frequency must be high in order to achieve a fast response and avoid twitchy behaviour of the controlled device [20].



PWM control can be applied to power control for devices such as motors, light sources, and servos. It can also be used in communications, where the varying duty cycle can represent specific data.
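To make the relation between duty cycle and average output concrete, the following is a minimal MATLAB sketch that synthesises a square wave and computes its mean value. The 5 V level and 500 Hz frequency match the Arduino output described in Section 3.4, while the 25 % duty cycle is an illustrative assumption.

```matlab
% Minimal sketch: synthesise a PWM square wave and check that its mean value
% equals duty * Vsupply.
Vsupply = 5;        % "on" level of the square wave (V)
f       = 500;      % modulation frequency (Hz)
duty    = 0.25;     % duty cycle (25 %)

t = linspace(0, 3/f, 3000);             % three PWM periods
v = Vsupply * (mod(t*f, 1) < duty);     % high for the first 25 % of each period

Vavg = mean(v);                         % close to duty * Vsupply = 1.25 V
plot(t, v); ylim([-0.5, 5.5]);
xlabel('Time (s)'); ylabel('Output voltage (V)');
title(sprintf('%.0f %% duty cycle, average %.2f V', duty*100, Vavg));
```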

3.3 Motion Tracking

3.3.1 Colour Model

The RGB colour model is based on the principle that a broad array of colours can be generated by adding red, green, and blue light together in different ways [21]. It is well known and widely used in display devices such as televisions and computer monitors. The name of the model comes from the initials of the three additive primary colours: red, green, and blue. The basic additive principles of the three primary colours are illustrated in Figure 4.

Figure 4 Basic Additive Principles of the RGB Colours, Retrieved from [22].

Normally, a one-by-three matrix is used to represent a colour in the RGB model. The RGB colour space is illustrated in Figure 5.


Figure 5 RGB Colour Space, Retrieved from [23].

The other well-known colour model is the HSL colour model. HSL is short for hue, saturation, and luminance (lightness, or brightness) [24]. It is commonly applied in image analysis. The HSL colour space is illustrated in Figure 6.

Figure 6 HSL Colour Space, Retrieved from [25].


The HSL colour model can be treated as a cylindrical-coordinate representation of the RGB colour model. Hence, the RGB model and the HSL model can be converted into each other.
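In MATLAB, the built-in functions rgb2hsv and hsv2rgb convert between RGB and the closely related cylindrical HSV (hue, saturation, value) representation; a minimal sketch follows, in which the pure-red example pixel is an illustrative assumption.

```matlab
% Convert a single RGB colour to MATLAB's cylindrical HSV representation and back.
% The pure-red example pixel is an illustrative assumption.
rgb      = [1 0 0];            % red, in normalised [0, 1] RGB
hsv      = rgb2hsv(rgb);       % [hue saturation value] = [0 1 1] for pure red
rgb_back = hsv2rgb(hsv);       % recovers the original RGB triple

% The same functions also accept whole images (m-by-n-by-3 arrays), which is
% how a captured camera frame would be converted before further analysis.
```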

There are also other commonly used colour models, such as CMY, YIQ, YUV, XYZ, etc. [26]. In this project, only the RGB and HSL models are used for image analysis.

3.3.2 Tracking of Coloured Object

A programme written in MATLAB can be found in [27]. This programme is able to track all red objects and draw a bounding box around each of them. The fundamental principles and procedure of this programme are as follows (a condensed code sketch is given after the list):

a. A function is defined in MATLAB to access the camera connected to the computer.
b. An image is obtained through the camera and saved in the RGB colour model. An example image is shown in Figure 7; there is a red object, a key, in the middle of the image. In MATLAB, this image is saved as a 480×640×3 matrix.
c. Only the red colour in the image is selected, as illustrated in Figure 8. Specifically, the image can be regarded as a composition of three colour layers: red, green, and blue. Only the red colour layer is retained; the rest is eliminated.
d. At the same time, the original image is converted to grayscale (see Figure 9).
e. The grayscale image is subtracted from the red colour layer. The result is shown in Figure 10; the red zone is clearly highlighted.
f. A median filter is applied to the image obtained from the previous step (see Figure 11). Median filtering is a nonlinear operation which reduces "salt and pepper" noise while preserving edges. After filtering, each output pixel contains the median value of the neighbourhood around the corresponding pixel in the input image.
g. The image is converted from grayscale to binary, as shown in Figure 12; each pixel is either white (1) or black (0). A level in the range [0, 1] is used during the conversion, and it defines the degree of brightness above which a pixel is converted to white.
h. All small objects are removed from the binary image. In the example image (see Figure 13), connected objects with fewer than 300 pixels are removed.
i. The regions left in the image are detected and measured using built-in MATLAB commands. A bounding box can then be drawn and the centroid labelled with its coordinates. The final result is shown in Figure 14.
j. The whole process is repeated until the user stops it. If the processing speed of this programme is fast enough, the motion of a red object can be tracked by recording the coordinates.
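The following is a condensed MATLAB sketch of steps c to i above, assuming a captured frame rgbFrame is already available; the threshold level is an illustrative assumption, while the 300-pixel limit follows the description above.

```matlab
% rgbFrame: a 480x640x3 uint8 image captured from the camera (assumed given).
redLayer  = rgbFrame(:,:,1);                  % keep only the red colour layer
grayFrame = rgb2gray(rgbFrame);               % grayscale version of the frame

diffFrame = imsubtract(redLayer, grayFrame);  % highlight strongly red regions
diffFrame = medfilt2(diffFrame, [3 3]);       % remove "salt and pepper" noise
binFrame  = im2bw(diffFrame, 0.18);           % threshold level is an assumption
binFrame  = bwareaopen(binFrame, 300);        % drop objects under 300 pixels

% Measure the remaining regions and read off centroids and bounding boxes.
stats = regionprops(binFrame, 'BoundingBox', 'Centroid');
for k = 1:numel(stats)
    c = stats(k).Centroid;                    % [x y] coordinates of the object
    fprintf('Object %d centroid: (%.1f, %.1f)\n', k, c(1), c(2));
end
```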

Figure 7 Original Image of a Red Object.

Figure 8 Red Colour in the Original Image is Chosen.


Figure 9 Grayscale Image Converted from the Original Image.

Figure 10 Result of Subtracting.

Figure 11 Result of Median Filtering.


Figure 12 Binary Image.

Figure 13 Binary Image after Removing Small Objects.

Figure 14 Final Result of Coloured Object Tracking.


This programme can be adjusted to track green or blue objects as well. In this project, a piece of blue paper is attached to the top of the e-puck robot for motion tracking. Moreover, timing information is added when adjusting this programme so that the speed can be calculated.
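A minimal sketch of such a speed calculation is shown below, assuming two consecutive centroid positions from the tracker; the sample values, sampling period, and pixel-to-centimetre scale factor are illustrative assumptions rather than the calibration actually used.

```matlab
% Two consecutive centroid positions in image coordinates (pixels).
p1 = [250.4 310.2];           % centroid at the previous sample (assumed values)
p2 = [262.1 318.7];           % centroid at the current sample (assumed values)

Ts          = 0.5;            % sampling period in seconds (assumed)
pixelsPerCm = 6.4;            % pixel-to-centimetre scale factor (assumed)

distancePx  = hypot(p2(1) - p1(1), p2(2) - p1(2));  % displacement in pixels
speedCmPerS = (distancePx / pixelsPerCm) / Ts;      % estimated speed (cm/s)
```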

3.4 Arduino Board Specifications

The Arduino UNO board (see Figure 15) is one of the most popular boards in the Arduino series.

Figure 15 Front of Arduino UNO Board, Reprinted from [28].

It is an integrated circuit board containing a microcontroller, digital/analogue input/output pins, a ceramic resonator, a USB connection, a power jack, an ICSP header, and a reset button. The specifications of the Arduino UNO board are described in Table 2:

Table 2 Specifications of Arduino UNO Board [28].

Microcontroller | ATmega328
Operating Voltage | 5 V
Input Voltage (Recommended) | 7-12 V
Input Voltage (Limits) | 6-20 V
Digital I/O Pins | 14 (of which 6 provide PWM output)
Analogue Input Pins | 6
DC Current per I/O Pin | 40 mA
DC Current for 3.3 V Pin | 50 mA
Flash Memory | 32 KB (ATmega328), of which 0.5 KB used by the bootloader
SRAM | 2 KB (ATmega328)
EEPROM | 1 KB (ATmega328)
Clock Speed | 16 MHz

This board can simply be connected to a computer with a USB cable, and it is powered via the USB connection or an external power supply. The Arduino software [29] is used for programming and for uploading the programme to the board. The Arduino language is based on C/C++, and there are sample programmes on the Arduino website. After programming, the Arduino board is able to communicate with a computer via serial data.

The microcontroller of the Arduino UNO board is capable of creating a square wave signal in order to perform PWM control. The modulation frequency is about 500 Hz, and the voltage of the output signal is either 5 V (on) or 0 V (off). There is a built-in function, analogWrite(), which writes an analogue value to a pin to perform PWM control. The value of the PWM wave ranges from 0 to 255; for instance, analogWrite(127) gives approximately a 50 % duty cycle (see Figure 16).

Figure 16 PWM Control via Arduino Board, Adapted from [30]
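A minimal MATLAB sketch of this mapping, as the computer might compute it before sending a value to the board, is given below; the requested duty cycle is an illustrative assumption, and the 5 V level is the board's output level quoted above.

```matlab
% Minimal sketch of the mapping from a requested duty cycle to the 8-bit value
% expected by analogWrite(). The requested duty cycle is an assumed example.
duty  = 0.50;                  % requested duty cycle (50 %)
value = round(duty * 255);     % value to pass to analogWrite(), in the range 0..255
Vavg  = duty * 5;              % average of the resulting 0/5 V square wave (2.5 V)
```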


3.5 E-puck Robot Specifications

3.5.1 Technical Specifications

The appearance of the e-puck robot is shown in Figure 17, with components labelled.

Figure 17 Appearance of the E-puck, with Labels of Components, Retrieved from [31].

The technical specifications of the e-puck robot are displayed in Table 3 as follows:

Table 3 Technical Specifications of the E-puck [32].

Features | Technical Information
Size, weight | Diameter: 70 mm; height: 55 mm; weight: 150 g
Battery autonomy | 5 Wh LiION battery, providing about 3 hours of autonomy
Processor | dsPIC 30F6014A @ 60 MHz
Motors | 2 stepper motors (20 steps per revolution); 50:1 reduction gear; maximum speed about 1000 steps/s; the distance between the wheels is about 53 mm [33]
IR sensors | 8 infra-red sensors measuring ambient light and proximity of objects up to about 6 cm
Camera | VGA colour camera (maximum resolution 480x640)
Microphones | 3 omni-directional microphones capable of sound localization
Accelerometer | 3D accelerometer along the x, y and z axes
LEDs | 1 green and 8 red LEDs on the ring; 1 strong red LED in front
Speaker | On-board speaker which can play WAV and tone sounds
Switch | 16-position rotating switch on the top of the robot
Wireless | Bluetooth for robot-computer and robot-robot wireless communication
Remote control | Infra-red receiver for standard remote control commands
Expansion bus | Large expansion bus designed to add new capabilities
Programming | C programming (e.g. using MPLAB [34])

3.5.2 IR Proximity Sensors

The eight IR proximity sensors are located around the e-puck robot as illustrated in Figure 18.

Figure 18 Positions of the IR Proximity Sensors from Top View, Reprinted from [35].

The IR proximity sensor can estimate the proximity of an obstacle. First, it measures the ambient light. It then emits a beam of infrared light and measures the sum of the ambient and reflected light. After subtracting the ambient reading, the reflected light can be calculated, so that the proximity can be estimated.


Here is an example from [35], in which the robot starts against a wall and moves backwards. The raw data obtained from IR0 (a front sensor) are shown in Figure 19, where the x-axis shows the distance from the wall in steps (1000 steps correspond to 12.8 cm) and the y-axis shows the raw data from sensor IR0. According to Figure 19, the proximity value increases when the e-puck moves away from an obstacle.

Figure 19 Example of Raw Data Acquired from IR0, Reprinted from [35].

However, in this project the proximity value is not currently needed; only the ambient light is measured.

There is unavoidable noise in the measurements of the IR proximity sensors. This noise comes from the power supply of the e-puck and increases as the battery runs down [35].


3.6 Field Effect Transistor

The field-effect transistor is a semiconductor device which is capable of controlling a current by means of an electric field. There are two types of field-effect transistors: the junction field-effect transistor (JFET, or simply FET) and the insulated-gate field-effect transistor (IGFET), the latter more commonly called the metal-oxide-semiconductor transistor (MOST or MOSFET) [36]. Compared with conventional transistors, the advantages and disadvantages of field-effect transistors are listed in Table 4:

Table 4 Advantages and Disadvantages of Field-effect Transistors Compared with Conventional Transistors [36].

Advantages:
- Unipolar (the operation depends only on the flow of the majority carriers)
- Relatively immune to radiation
- High input resistance
- Less noisy than bipolar transistors
- No offset voltage at zero drain current
- Thermal stability

Disadvantages:
- Relatively small gain-bandwidth product
- For MOSFETs, susceptible to overload voltages (may be damaged by static electricity even during handling)

Basically, a field-effect transistor has three terminals, which are explained as follows:

Gate (G): the control terminal, whose voltage determines the conductivity of the channel.

Drain (D): the terminal through which the majority carriers leave the bar.

Source (S): the terminal through which the majority carriers enter the bar.

The field-effect transistor used in this project is an insulated-gate FET (MOSFET); its technical specifications can be found in [37]. The basic structure of an n-channel MOSFET is illustrated in Figure 20. A lightly doped p-type substrate forms the bottom of the transistor, and two heavily doped n+ sections are diffused into it; these n+ regions work as the source and the drain. A layer of insulating silicon dioxide (SiO2) is placed on the surface of this structure, and holes are cut in the layer so that the source and the drain can be contacted by the external circuit. The layer is then covered with the gate metal.

Figure 20 Basic Structure of an N-channel MOSFET, Reprinted from [36].

The working principle of the FET is that the voltage applied between the gate and the source determines the size and shape of the "conductive channel", and thereby controls the flow of electrons from the source to the drain.


Chapter 4 - Design and Implementation

This chapter is divided into five parts, covering the controllable light source, the programming of the e-puck, the arena, motion tracking, and system identification. Each part has its own focus, and the design, implementation, and analysis of each part are discussed in turn.

4.1 Controllable Light Source

4.1.1 Design of the Light Source

The basic ideas of the controllable light source are: an Arduino UNO board is

programmed to receive the serial data from a computer in real-time; the board will

conduct a PWM control to change the light intensity of a lamp; the light intensity of

the lamp depends on the serial data sent from the computer to the board.

A sketch of the circuit of the controllable light source is displayed in Figure 21.

Figure 21 Sketch of the Circuit of the Controllable Light Source.


This circuit consists of an Arduino UNO board, a 100 Ω resistor, a 100 kΩ resistor, a lamp, and a power supply. Pin 6 of the Arduino board is selected to generate the control signal. The power supply used in this project is a DC power supply whose output voltage ranges from 0 V to 12 V; an output voltage of 6 V is selected, which is safe and sufficient for illumination. The lamp chosen in this project has a voltage rating of 6.5 V and a power rating of 2 W, so its power requirement is essentially met by this supply.
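As a rough check on this choice, the rated lamp current can be estimated in Matlab as follows (a sketch using only the rated values quoted above, not measured data):

V_rating = 6.5;                   % lamp voltage rating in V
P_rating = 2;                     % lamp power rating in W
I_rating = P_rating / V_rating;   % approx. 0.31 A rated current per lamp
% At the 6 V supply each lamp draws slightly less than this, so it runs safely, only a little dimmer.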

When implementing the controllable light source, more than one lamp connected in parallel is used, so that the light is distributed as evenly as possible in the environment and the light intensity is high enough for the e-puck to recognise. In addition, several lamp holders and wires are used in constructing the light source.

4.1.2 Programming for Controllable Light Source

The code for the Arduino board is as follows:

int Pin = 6;              // Pin 6 is used for PWM control
int lightIntensity = 0;   // the initial light intensity is set to 0

void setup()
{
  Serial.begin(9600);     // the serial communication is initialized
}

void loop()
{
  while (Serial.available() > 0)      // check if there is any serial data available
  {
    lightIntensity = Serial.read();   // the serial data is saved
  }
  analogWrite(Pin, lightIntensity);   // conduct the PWM control
  delay(30);                          // wait for 30 ms
}

The software used in the computer to send serial data is Matlab. The reason why

Matlab is selected is that it will also be used for motion tracking and system

identification. Using the same software can make the whole process much easier.


The core Matlab code for sending the serial data, with explanations, is as follows:

s = serial('COM5','BaudRate',9600);

% create serial port object namely “s” in port COM5, with transfer speed 9600bit/s

fopen(s);

% open the serial port object, s

fwrite(s, light_intensity, 'uint8', 'sync');

% the data, light_intensity, is sent synchronously to the object, s, in the form of uint8

fclose(s);

% close the serial port object, s

4.1.3 Implementation of the Light Source

Before connecting the components in the circuit, pin 6 and pin GND (ground) of the

Arduino UNO board were connected with an oscilloscope. An example programme

from the Arduino library, Fading [38], was uploaded to the board to test whether the Arduino board works properly. The result of this test is shown in Figure 22.

Figure 22 Result of the Test of the Arduino Board.

Next, all the components in the circuit were connected according to Figure 21 by

soldering. In this section, only one lamp was used. The main circuit (without lamps


and power supply) is displayed in Figure 23 and Figure 24. Components in the

figures are labelled.

Figure 23 Front of the Main Circuit of the Controllable Light Source.



Figure 24 Back of the Main Circuit of the Controllable Light Source.

After that, the Arduino board was connected to a computer and the programme in 4.1.2 was uploaded to it. Another programme was then written in Matlab to change the light intensity gradually: the light intensity was increased by 5 every second; when it reached the maximum value of 255 it began to decrease, and when it returned to 0 it began to increase again. The code is as follows:

clear; clc;
light_intensity = 0;
fade_value = 5;
s = serial('COM5','BaudRate',9600);
fopen(s);
while (1)
    c = clock;
    light_intensity = light_intensity + fade_value;
    if (light_intensity == 255)
        fade_value = -5;
    elseif (light_intensity == 0)
        fade_value = 5;
    end
    disp(light_intensity)
    fwrite(s, light_intensity, 'uint8', 'sync');
    pause(1);
end

As a result, the light intensity of the lamp changed gradually as expected, so the implementation of the controllable light source was successful. In the following experiments, slight adjustments could be made, such as increasing the number of lamps according to specific needs.

4.2 E-puck Robot for Simulation of an Animal

4.2.1 Design of the Motion of the E-puck

The basic types of motion are linear motion and circular motion. During linear motion, however, it is unavoidable that the e-puck eventually encounters an obstacle such as the wall of the arena. Although an e-puck can be programmed to turn around and continue moving, this introduces pauses into its movement, which may cause extra problems when the motion of the e-puck is analysed. Circular motion is therefore the most stable type of motion for an e-puck in an arena of limited area; in particular, when the radius of the circular motion is constant, the e-puck should in theory never reach the boundary of the arena.

According to this idea, the motion of the e-puck was designed as follows: the e-puck performs a circular motion in the environment (an arena of limited area); the radius of the circular motion is constant; and the speed of the e-puck increases with the light intensity in the environment. The constant radius of the circular motion is achieved by keeping the ratio of the speeds of the two wheels constant.

4.2.2 Test of the IR Sensors of the E-puck

Before programming the e-puck robot, the function of its IR sensors should be understood first; in other words, it should be made clear how the value of an IR sensor changes with the light intensity. Hence a simple programme was written in MPLAB and uploaded to the e-puck robot using Tiny Bootloader [39] via Bluetooth. Tiny Bootloader can be used to write programs to the microcontroller of the e-puck robot and also to receive data sent back from the robot. The core code for an e-puck to send data to the computer is as follows:

sensor[0] = e_get_ambient_light(0);
sprintf(buffer, "%d\r\n", sensor[0]);
e_send_uart1_char(buffer, strlen(buffer));

In this way, the raw data from the sensor IR0 could be obtained from the e-puck. A bright lamp was placed at different distances from the sensor IR0, from far to close, and the values of the sensor were displayed in Tiny Bootloader (see Figure 25).

Figure 25 Interface of Tiny Bootloader.

It turned out that when there was no lamp or the lamp was far from the sensor, the value of the sensor was about 3968; when the lamp was placed closer to the sensor, the value decreased, reaching an approximate minimum of 88. Because there was no filter in the programme used for this test, the values of the sensor were seriously affected by noise and fluctuated. Nevertheless, it is clear that the value of an IR sensor of the e-puck robot decreases as the light intensity increases, and that the values of the sensor lie within the range 0 to 4000. This kind of test would be conducted again after the environment was built.

4.2.3 Test of the Motors of the E-puck

The speed of a motor in the e-puck ranges from 0 to 1000 step/s in either direction. However, during the actual tests, when the speed of a wheel was set too low, the wheel did not rotate properly, e.g. it sometimes stopped for a while, and individual differences existed between e-puck robots. This problem may be due to ageing of the equipment, such as wear and tear of the gears in the e-puck, or to an unstable power supply to the stepper motor. A test was therefore conducted to find the minimum speed at which the motor of the e-puck robot works properly.

In this test, a simple programme was written to define the step speed of each wheel of the robot. The core code is:

e_set_speed_left(left_speed);
e_set_speed_right(right_speed);

The speeds of the wheels were set starting from 1 step/s and increased manually. From visual inspection, the minimum value at which the motor worked properly was about 300 step/s.

4.2.4 Programming of the E-puck

A programme was written based on the design in 4.2.1; it is attached in Appendix A.1. The programme has two functions. One is to control the motion of the e-puck in reaction to the ambient light intensity. The other is to keep the robot static and send the sensor data, or certain specific values, to the computer for further testing and debugging. The core code of both functions is similar and is shown as follows:

sensor[0] = 4000 - e_get_ambient_light(0);
sensor[1] = 4000 - e_get_ambient_light(1);
sensor[2] = 4000 - e_get_ambient_light(2);
sensor[3] = 4000 - e_get_ambient_light(3);
sensor[4] = 4000 - e_get_ambient_light(4);
sensor[5] = 4000 - e_get_ambient_light(5);
sensor[6] = 4000 - e_get_ambient_light(6);
sensor[7] = 4000 - e_get_ambient_light(7);

intensity = 0;
for (i = 0; i < 8; i++)
    intensity += sensor[i];
light_intensity = 0.5*intensity + 0.5*intensity0;
intensity0 = intensity;

speed = 300 + k*light_intensity;
left_speed = 0.5*speed + 0.5*speed0;
speed0 = speed;
if (left_speed * r > 1000)
    left_speed = 1000/r;
right_speed = r*left_speed;

e_set_speed_left(left_speed);
e_set_speed_right(right_speed);

This code is executed repeatedly. First, the values of the eight IR sensors are read and subtracted from 4000 to give values of light intensity, and these values are added together. The speed of the left wheel is then set to a base value of 300 step/s plus a term proportional to the summed light intensity, and the speed of the right wheel is set to r (r > 1) times the speed of the left wheel, producing a circular motion of the robot. Moreover, if the speed of the right wheel would exceed 1000 step/s, it is limited to 1000 step/s.

In this code, k and r are two key coefficients: k determines how strongly the speed of the robot changes with the light intensity, and r determines the radius of the circular motion. The coefficient k cannot be determined until the environment is built. The value of r was chosen to be either 1.5 or 2. According to Equation 3 in 3.1, the radius of the circular motion can be calculated.

When r is 1.5,

$R = \frac{53}{2} \cdot \frac{V_l + 1.5\,V_l}{1.5\,V_l - V_l} = 132.5\ \mathrm{mm}$

When r is 2,

$R = \frac{53}{2} \cdot \frac{V_l + 2\,V_l}{2\,V_l - V_l} = 79.5\ \mathrm{mm}$
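The same calculation can be written as a short Matlab check (a sketch assuming the differential-drive relation used above and the 53 mm distance between the wheels from Table 3):

d = 53;                               % distance between the wheels in mm
r = [1.5 2];                          % ratios of right to left wheel speed
R = (d/2) .* (1 + r) ./ (r - 1);      % R = (d/2)*(Vl + r*Vl)/(r*Vl - Vl)
% R = [132.5 79.5] mm, matching the two values above.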

4.3 Arena

4.3.1 Design of the Arena

According to the radius of the circular motion calculated in 4.2.4, if the arena is a square, the side length should theoretically be at least 2 × 132.5 + 70 = 335 mm (the sum of the diameter of the circular motion and the diameter of the e-puck). Considering that the e-puck robot may drift, the actual side length of the arena should be much larger. The basic ideas of the design of the arena are therefore: a square arena is built; four lamps of the same type are placed in the four corners of the arena; and a camera is placed above the centre of the arena.

4.3.2 Construction of the Arena

On the basis of these design ideas, four white plastic boards, each about 800 mm × 280 mm, were used to build the arena. The plastic boards were joined as shown in Figure 26. The lamps were connected in parallel and fixed on the walls of the arena, and wires of different colours were used to distinguish the direction of current flow.


Figure 26 Corner of the Arena.

The completed arena is illustrated in Figure 27. A camera is fixed above the centre of the arena, supported by foamed plastics and paper boxes, so that its height and position can be adjusted easily. The inner dimensions of the arena are 760 mm × 760 mm.


Figure 27 The Entire Arena.

After the construction of the whole arena, and once the controllable light source and the camera had been connected to the computer, several blue markers (made of paper) were placed in the arena and the camera was turned on. The position of the camera could then be calibrated simply by checking the screenshots from the camera.

4.3.3 Test of the Distribution of Light in the Arena

An e-puck robot was programmed to keep sending the value of light intensity to the

computer. Then all the lamps in the arena were turned on with maximum light

intensity. In this case, the values of light intensity in different positions in the arena

could be measured. The positions in the arena can be illustrated by Figure 28.


Figure 28 Positions in the Arena for Measuring the Light Intensity.

During the test, the direction of the e-puck robot was kept the same. The values of the

sum of the light intensity obtained from eight IR sensors in the e-puck robot are listed

in Table 5.

Table 5 Values of the Sum of the Light Intensity.

1 2 3 4 5

1 6720 4570 3440 4430 6555

2 4600 4180 3850 4267 4835

3 3750 3946 3765 3830 3780

4 3870 3783 3700 4015 4614

5 5410 4251 3689 4476 6330

In addition, when all the lamps were turned off, the value of the sum of the light

intensity was about 197.


To explain the distribution of the light intensity in the arena more clearly, a figure was plotted using the image() function in Matlab, which is shown in Figure 29.

Figure 29 Image of Distribution of the Light Intensity.

Areas of high light intensity are represented in red, whereas areas of lower light intensity are represented in green. It can be concluded from this figure that the light intensity in the middle area, with dimensions of about 390 mm × 390 mm, is almost evenly distributed.
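A minimal sketch of how such a figure can be produced from the values in Table 5 (the exact colours depend on the Matlab colour map):

L = [6720 4570 3440 4430 6555;
     4600 4180 3850 4267 4835;
     3750 3946 3765 3830 3780;
     3870 3783 3700 4015 4614;
     5410 4251 3689 4476 6330];
imagesc(L);                            % scaled colour image of the 5 x 5 grid of measurements
colorbar;                              % the corners are brightest, the middle region is nearly flat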

4.3.4 Determining the Coefficients in the Programming of the E-puck

This part determines the two coefficients, r and k, in the programme of the e-puck robot.

According to the calculations in 4.2.4 and the results in 4.3.3, the radii of the circular motion of the e-puck for r = 1.5 and r = 2 are 132.5 mm and 79.5 mm respectively. Provided that the e-puck robot always moves around the same centre and does not drift, its region of motion will lie within the middle area of the arena


where the light intensity is evenly distributed. Hence, both r = 1.5 and r = 2 are suitable. However, when r = 2, the step speed of the left wheel can only vary from 300 to 500 step/s (since the maximum speed of the right wheel is 1000 step/s), which is a rather small range and would hinder the analysis of the motion in the subsequent tasks. Thus, r = 1.5 was selected for this project.

Considering that the maximum step speed of the right wheel is 1000 step/s, when r = 1.5 the maximum step speed of the left wheel is about 666 step/s. It is assumed that when the light intensity in the arena reaches its maximum, the speed of the left wheel should also be set to about its maximum. In this case, an equation can be used to approximately determine the coefficient k:

$300 + k \cdot LightIntensity_{max} \approx 666$

According to Table 5, the maximum value of the light intensity in the middle area of

the arena is about 3800. Therefore, k = 0.1 is chosen.
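The same reasoning can be written as a one-line Matlab check (a sketch using the values quoted above):

k = (666 - 300) / 3800                 % approx. 0.096, rounded to 0.1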

4.4 Motion Tracking

4.4.1 Adjustment of the Motion Tracking Programme

A programme has been introduced in 3.3.2 which is capable of tracking coloured

objects. However, there is a drawback in this programme, which is that only the

coordinates of the object can be obtained. It is not possible to deal with the speed

analysis of the object by using the existing programme. Thus, time should be involved

in this programme for speed calculation.

In the new program, the coordinates of a blue object were recorded as well as the

time. In this way, both the time interval and the distance the object moved could be

calculated. If the time interval was small enough, the ratio of the distance and the time

interval could be regarded as the speed of the object. Such a time interval could be

treated as the sampling period of the motion tracking system. Initially, a sampling

period of 0.2sec was selected.
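A minimal sketch of the speed estimate added to the tracking programme (hypothetical coordinates and timestamps; positions in unit lengths, times in seconds):

p1 = [120.4 250.7];  t1 = 0.0;         % position and time of one snapshot
p2 = [133.0 248.1];  t2 = 0.2;         % position and time of the next snapshot
speed = norm(p2 - p1) / (t2 - t1);     % coordinate speed in ul/s over the sampling period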


4.4.2 Test of the Coordinate System

This test aims to find the relationship between the coordinate distance in the motion tracking system and the actual distance. Two blue markers (made of paper) were placed randomly in the arena, and the actual distance between them was measured with a ruler. The tracking program was then run to obtain the coordinates and calculate the distance in the coordinate system. A distance of value 1 in the coordinate system was regarded as 1 unit length (ul), and the unit length is used as the unit of distance in the coordinate system. The distance calculated by the motion tracking programme was not constant; its decimal part fluctuated because the visual-based motion tracking system has limited accuracy. Comparing the varying coordinates visually, the error is approximately within one unit length. Therefore, two values of the distance were recorded each time and their average was used to reduce the error. The data acquired in this test are given in Table 6.

Table 6 Data from the Test of the Coordinate System.

Trial No.   Actual Distance Measured /mm   Coordinate Distance /ul   Average Coordinate Distance /ul
1           243                            322.1215, 322.2994        322.2105
2           300                            393.4191, 393.2721        393.3456
3           150                            195.3360, 195.3930        195.3645
4           256                            336.1152, 336.8739        336.4946
5           298                            392.3337, 392.4442        392.3890
6           211                            276.9326, 276.712         276.8223
7           198                            260.3795, 260.5295        260.4545
8           173                            227.4322, 227.2274        227.3298

It is reasonable to assume that when the actual distance is 0, the coordinate distance is also 0. The relationship between the distance in the coordinate system and the actual distance is further illustrated in Figure 30.

Figure 30 Relationships between the Actual and Coordinate Distances.

In Figure 30, the x-axis shows the coordinate distance and the y-axis shows the actual distance measured in mm. All the points lie nearly on a straight line, so it can be concluded that the coordinate distance is proportional to the actual distance; in other words, the coordinate system of the motion tracking programme is a scaled-up version of the arena. A trend line plotted in this figure almost passes through the point (0, 0), and its gradient can be regarded as the ratio between the actual distance and the coordinate distance, which is 0.7592.

Trend line fitted in Figure 30: y = 0.7592x + 0.3911, where x is the coordinate distance (ul) and y is the actual distance (mm).
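The trend line can be reproduced with a straight-line fit to the averaged values in Table 6 (a sketch; the coefficients should come out close to those quoted above):

ul = [322.2105 393.3456 195.3645 336.4946 392.3890 276.8223 260.4545 227.3298];
mm = [243 300 150 256 298 211 198 173];
p  = polyfit(ul, mm, 1);               % p(1) is the gradient (about 0.76 mm/ul), p(2) the intercept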


4.4.3 Test of Motion Tracking

The test of the coordinate system gave the ratio between the actual distance and the coordinate distance, which is 0.7592. Hence, once the speed of the object is estimated by the motion tracking programme, the actual speed can be calculated as 0.7592 times the coordinate speed. A series of tests were then conducted to verify the motion tracking system. During these tests, the e-puck robot performed a circular motion. The speed of the e-puck robot was set manually, with the step speed of the right wheel defined to be 1.5 times that of the left wheel. The theoretical speed of the e-puck can then be calculated using the equations in 3.1.

$Speed = \omega \cdot R = \frac{V_l + V_r}{2}$

As a displacement of 1000 steps corresponds to about 128 mm, the theoretical speeds corresponding to the defined wheel speeds are shown in Table 7.

Table 7 Theoretical Speed Corresponding to the Speed Defined.

No.   Step Speed of Left Wheel (step/s)   Step Speed of Right Wheel (step/s)   Theoretical Speed (mm/s)   Coordinate Speed (ul/s)
1     300                                 450                                  48                         63.2
2     400                                 600                                  64                         84.3
3     500                                 750                                  80                         105.4
4     600                                 900                                  96                         126.4
5     666                                 999                                  106.56                     140
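The conversion used to fill Table 7 can be summarised in a few lines of Matlab (a sketch assuming 0.128 mm per step and the 0.7592 mm/ul scale factor from 4.4.2):

vl = [300 400 500 600 666];            % step speed of the left wheel (step/s)
vr = 1.5 * vl;                         % step speed of the right wheel (step/s)
speed_mm = 0.128 * (vl + vr) / 2;      % theoretical speed in mm/s
speed_ul = speed_mm / 0.7592;          % the same speed expressed in coordinate units (ul/s)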

The sampling period of the motion tracking is approximately 0.2 sec. The data

acquired from the motion tracking system is displayed in Table 8.


Table 8 Values of Speed Estimated by the Motion Tracking System.

No. Coordinate Speed Estimated (ul/s)

(300, 450) (400, 600) (500, 750) (600, 900) (666, 999)

1 72.2943 95.5025 116.8936 148.5949 155.0220

2 83.4641 107.9583 140.8698 175.3612 95.2271

3 43.3436 114.7580 147.5847 83.1779 191.5879

4 82.7872 58.4618 75.1820 181.7324 174.9955

5 81.1310 107.9514 127.5791 85.5854 89.1598

6 42.4232 55.7569 147.1466 163.3359 186.2869

7 78.9645 116.2209 64.1467 168.1167 180.7604

8 81.2022 108.7357 138.3106 81.3445 86.4026

9 40.5560 56.4986 147.3754 172.1937 177.4127

10 80.0397 114.4581 74.1681 170.3453 178.4270

Mean 68.6206 93.6302 117.9257 142.9788 151.5282

A comparison is made between the theoretical speed and the average speed estimated

from the motion tracking programme in Figure 31.


Figure 31 Comparison between the Theoretical Values and Those Estimated

from the Programme.

Compared with the values estimated from the motion tracking system, the theoretical values of speed from the calculation are generally smaller. In fact, when the e-puck performs a circular motion from a point A to a point B, the path actually travelled along the arc is longer than the straight-line distance between the two points, so a speed estimated from the straight-line distance should, if anything, be smaller than the true speed. The observed situation is therefore unexpected. There are two possible reasons for it. Firstly, there is error in the positioning of the coloured points. In this test, the sampling period of the tracking system is about 0.2 s, and within such a short period the e-puck can only move a small distance, so the distance calculation is seriously affected by the positioning error; in other words, the error of the tracking process is amplified. Secondly, the theoretical calculation of speed may not match the actual situation. For instance, if the actual displacement per step is larger than that claimed in the technical specifications, or if wear and tear has occurred in the mechanism of the robot, the speed of the robot may be faster than expected. Some evidence supports this second explanation: the speed estimated from the


motion tracking system remains reasonably close to the theoretical value, and the estimated speed grows linearly with increasing step speed of the left wheel.

To find out whether the unexpected values of speed are due to the motion tracking system or to the e-puck robot, extended tests were conducted in which different sampling periods of the motion tracking system were used to estimate the speed of an e-puck robot. The speed was measured 50 times in each situation. The speed of the e-puck was set to be constant, with 300 step/s for the left wheel and 450 step/s for the right wheel; the corresponding theoretical value of speed in the coordinate system, calculated in Table 7, is 63.2 ul/s. The results are shown in Figure 32 and Figure 33.

Figure 32 Estimation of Speed with Sampling Period of 0.2, 0.3, and 0.5 sec.


Figure 33 Estimation of Speed with Sampling Period of 0.7, 1, 1.5 and 2 sec.

The following results can be drawn from the two figures. On the one hand, when the sampling period of the motion tracking system is small, the error of the motion tracking system is significant, leading to dramatically varying estimates of the speed. On the other hand, as the sampling period increases, the average estimated speed clearly decreases; this follows from the speed-calculation method described before, since over a longer interval the straight-line distance between samples increasingly under-represents the curved path actually travelled. In addition, the average estimated speed in every situation is larger than the theoretical value, which indicates that the unexpected values are not due to the error of the motion tracking system but rather to the e-puck robot itself.

Therefore, as a compromise, a sampling period of 0.5 s was chosen for the motion tracking system. This sampling period would also be used for the system identification part.


4.5 System Identification

This part can be treated as a combination of all the experiments conducted before. First, a controllable light source was designed and implemented, and a programme was written for the e-puck robot. Once the coefficients of the programme were determined in 4.2.4, the e-puck robot was able to perform a circular motion in the middle region of the arena. The motion tracking system was also built and verified experimentally. The whole experimental platform was therefore ready, and system identification algorithms could be used by the computer to learn about the motion of the e-puck.

4.5.1 Results from the Experiment

The process of the experiment is basically the same as described in 1.2. The e-puck robot was placed in the arena and performed a circular motion, with its speed changing according to the light intensity in the arena. A computer was used to change the light intensity in the arena gradually and to record the speed of the e-puck via the motion tracking system. The PWM value controlling the light intensity of the lamps was increased by 10 at each step, and under each light-intensity condition the speed of the e-puck robot was measured 30 times. The sampling time of the motion tracking system was 0.5 s. The Matlab code for this experiment is given in Appendix A.2. Because the e-puck robot sometimes drifted in the arena, which introduces outliers into the results, the experiment was repeated several times; the results with the fewest outliers are shown in Figure 34.


Figure 34 Results from the Experiment.

This figure was plotted using the boxplot() function in Matlab, which produces a box plot of the data. On each box, the central mark is the median, the edges of the box are the 25th and 75th percentiles, the whiskers extend to the most extreme data points not considered outliers, and outliers are plotted individually. There is only one outlier in this figure, at (0, 0); it is caused by the motion tracking programme, which sets the speed of the object to 0 when it takes the first snapshot. According to Figure 34, there is a clear trend that the speed of the e-puck robot increases with the PWM value.


4.5.2 One Basic System Identification Algorithm

The Matlab function polyfit() was used to find a simple relationship between the speed of the e-puck and the PWM value for the lamps. This function finds the coefficients of a polynomial p(x) of degree n,

$p(x) = p_1 x^n + p_2 x^{n-1} + \dots + p_n x + p_{n+1}$

Let y denote the coordinate speed of the e-puck robot and x the PWM value. When the degree of the polynomial y(x) is 1, the function polyfit() gives

$y = 0.3123 x + 62.7555$

In this case, the relationship between the speed of the e-puck and the PWM value is

linear (see Figure 35).

Figure 35 Relationship between y and x When Degree = 1.


When the degree is 2, the equation of the polynomial y(x) is estimated as

$y = 0.0009 x^2 + 0.0872 x + 71.7594$

The result is plotted in Figure 36.

Figure 36 Relationship between y and x When Degree = 2.

Compared with Figure 34, the latter equation is closer to the results from the experiment.
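For reference, the two fits above can be reproduced with a few lines of Matlab (a sketch; IP and OP are the vectors of PWM values and estimated speeds saved by the script in Appendix A.2):

load SI01.mat;                          % PWM values in IP, estimated speeds in OP
p1 = polyfit(IP, OP, 1);                % first-order fit,  roughly [0.3123 62.7555]
p2 = polyfit(IP, OP, 2);                % second-order fit, roughly [0.0009 0.0872 71.7594]
x  = 0:250;
plot(x, polyval(p1, x), x, polyval(p2, x));    % compare the two estimated models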

4.5.3 Analysis of the System Identification

Because one PWM value corresponds to 30 values of speed estimated from the motion tracking system, the result of the system identification cannot be compared with the experimental data directly. However, an average value of speed can be calculated to approximately represent the speed of the e-puck in an environment with a certain light intensity, and a programme was written for this purpose (see Appendix A.3).


A comparison between the result of the experiment and the model estimated from the system identification algorithm is made in Figure 37.

Figure 37 Comparison between Experimental Data and Result From System

Identification.

In Figure 37, the red curve shows the model estimated by the system identification algorithm, and the black dashed curve represents the average values from the experiment. It can be concluded that the model estimated by the system identification algorithm is close to the data obtained from the experiment. Moreover, compared with the initial programming of the e-puck robot and the tests in 4.4.3, the learning result of the system identification basically meets what was expected.
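The averaging step is equivalent to the loop in Appendix A.3 and can also be written compactly; a sketch, assuming the samples are stored as 30 consecutive measurements per PWM value as in the experiment above:

load SI01.mat;                                 % or whichever data file was saved by the experiment
meanSpeed = mean(reshape(OP, 30, []), 1);      % one average speed per PWM value
meanPWM   = mean(reshape(IP, 30, []), 1);      % the corresponding PWM values
plot(meanPWM, meanSpeed);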

4.5.4 Other Attempt

This part presents a simple attempt to use the System Identification Toolbox in Matlab to deal with the data from this experiment.


Firstly, the data were imported into the toolbox, and the input and output data were plotted against time, as shown in Figure 38. The time interval was selected as 0.5 s.

Figure 38 Raw Data from Experiment.

Then a transfer function model could be easily estimated using the toolbox. A transfer

function with 1 zero and 3 poles was found as shown in Figure 39.

Figure 39 Transfer Function Estimated from the Toolbox.


The model output was plotted against the real output, as shown in Figure 40.

Figure 40 Model Output Compared with Real Output.

This toolbox is also able to estimate other types of model, such as state-space models and non-linear models. However, for this project the whole system has only one input and one output, so it is more convenient to apply a simple system identification algorithm like the one in 4.5.3 to learn the behaviour of the robot than to use this toolbox. Conversely, if the system to be dealt with is complicated, this toolbox can be an excellent choice.
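The same toolbox estimate can also be run from the command line; a minimal sketch, assuming the logged input vector IP, output vector OP and the 0.5 s sampling time used above:

data = iddata(OP, IP, 0.5);            % package the output (speed) and input (PWM value)
sys  = tfest(data, 3, 1);              % estimate a transfer function with 3 poles and 1 zero
compare(data, sys);                    % plot the model output against the measured output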


Chapter 5 – Conclusions and Future Work

5.1 Evaluation of the Current Work

In this project, a computer controllable light source was designed and implemented.

An Arduino board was used during the implementation of the controllable light

source. Moreover, the e-puck robot system was studied. Especially the function of the

IR sensors was tested. Then an e-puck robot was programmed to perform a circular motion whose speed varies with the light intensity in the environment. The coefficients in the programme were determined after the arena was built. In addition, a visual-based motion tracking programme was modified to track the motion of the e-puck robot and estimate its speed. A series of tests were conducted to verify the motion tracking system. After that, a sampling period of 0.5 s for the motion tracking

motion tracking system. After that, a sampling period of 0.5s for the motion tracking

system was selected. At last, a basic system identification algorithm was applied for

the computer to learn the motion of the e-puck robot. The relationships between the

speed of the e-puck robot and the PWM value for controlling the light intensity were

estimated and presented in an equation. The result of the system identification was

simply verified using Matlab. Overall, the aims and objectives of this project have

been almost completely achieved. The platform that enables system identification

algorithms to learn about the behaviour of a physical agent (e.g., animals, robots)

through controlled interaction has been successfully built and utilized.

There are some drawbacks in the project. Firstly, the visual-based motion tracking system has limited accuracy: whenever the coordinates of a coloured object were measured, there was always a noticeable error. Secondly, the light intensity was not evenly distributed over the entire arena; when the e-puck drifted, it could move into a region where the light intensity was unevenly distributed, which could seriously affect the result of the experiment. Thirdly, the system identification algorithm was only verified in a simple way using Matlab rather than being verified experimentally.


5.2 Future Work

Due to the limited time of this project, there are several ideas which have not been

realised. If there is more time for the project, the future work is planned as follows:

Firstly, a computer controllable sound source will be designed and implemented in the

arena. Then the animal will react to both the light intensity and the sound. This will

make the whole system more complicated and interesting.

Secondly, when there are multiple varying factors in the environment, the system identification algorithm used in this project will no longer be applicable, because it can only deal with a system with a single input and a single output. In that case, the System Identification Toolbox in Matlab will be studied further and used.

Thirdly, a coevolutionary algorithm will be applied. As discussed in 2.2, a classifier will be designed and implemented. The classifier can then control the conditions in the environment and judge whether the robot behaves like an animal, and its fitness can be adjusted to improve its ability to judge. In this way, the whole system can be co-evolved.


REFERENCES [1] J.J. Bolhuis and L.-A. Giraldeau, “The Behavior of Animals: Mechanisms,

Function, and Evolution,” Wiley-Blackwell, USA, 2004.

[2] W. Li, M. Gauci, and R. Groß, “A Coevolutionary Approach to Learn Animal

Behavior Through Controlled Interaction,” Proc. of the Genetic and Evolutionary

Computation Conf., GECCO 2013.

[3] J. Evans and A. Rzhetsky, “Machine Science,” Science, vol. 329, no. 5990, pp.

399–400, 2010.

[4] F. Mondada et al, “The E-puck, a Robot Designed for Education in Engineering,”

2009. [Online]. Available: http://81.180.214.82/aE/Elab/Lab4/epuck-robotica2009.pdf.

[Accessed: 12 August 2013].

[5] R. D. King et al, “The automation of science,” Science, vol. 324, no. 5923, pp. 85–

89, 2009.

[6] A. Persidis, “High-throughput screening,” Nature Biotechnology, vol. 16, no. 5, pp.

488–493, 1998.

[7] N. MacLeod, M. Benfield, and P. Culverhouse, “Time to Automate Identification,”

Nature, vol. 467, no. 7312, pp. 154–55, 2010.

[8] R. Pintelon and J. Schoukens, “An Introduction to Identification,” in System

Identification, 2nd ed., Hoboken, N.J.: John Wiley & Sons Inc., 2012, pp. 1-2.

[9] M. Gevers, “A Personal View of the Development of System Identification: A 30-

Year Journey through an Exciting Field,” IEEE Trans. Control Systems, vol. 26, no. 6,

pp. 93-105, Dec. 2006.


[10] B.L. Ho and R.E. Kalman, “Effective Construction of Linear State-variable

Models from Input-output Functions,” Regelungstechnik, vol. 12, pp. 545–548, 1965.

[11] K.J. Åström and T. Bohlin, “Numerical Identification of Linear Dynamic

Systems from Normal Operating Records,” in Proc. IFAC Symp. Self-Adaptive

Systems, Teddington, U.K., 1965, pp. 96–111.

[12] D.R. Brooks, “Testing the Context and Extent of Host-parasite Coevolution,”

Systematic Biology, vol. 28, no. 3, pp. 299-307, Apr. 1979.

[13] J.C. Bongard and H. Lipson, “Automated Robot Function Recovery after

Unanticipated Failure or Environmental Change Using a Minimum of Hardware

Trials,” In Proceedings of the 2004 NASA/DoD Conference on Evolvable Hardware,

pp. 169–176. IEEE Computer Society, 2004.

[14] M. Mirmomeni and W. Punch, “Co-evolving Data Driven Models and Test Data

Sets with the Application to Forecast Chaotic Time Series,” In 2011 IEEE Congress

on Evolutionary Computation, New Orleans, LA: Auburn University, 2011, pp. 14–

20.

[15] M.O. Tokhi, “Lecture 1 Introduction,” lecture notes of ACS 6118 Robotics and

Multi-sensor Systems, pp. 2-4, 2013.

[16] L. Wang, K.N. Tan, C.M. Chew, “Artificial Evolution Based Autonomous Robot

Navigation,” in Evolutionary Robotics: from Algorithms to Implementations,

Hackensack, N.J.: World Scientific Pub., 2006, pp. 1-24.

[17] E-puck Website. [Online]. Available: http://www.e-puck.org/. [Accessed: 18

August 2013].

[18] GCtronic. [Online]. Available: http://www.gctronic.com/e-puck.php. [Accessed:

18 August 2013].


[19] G. Dudek and M. Jenkin, “Locomotion,” in Computational Principles of Mobile

Robotics, 2nd ed., New York: Cambridge University Press, 2010, pp. 36-48.

[20] S.A. Pope, “The Digital World and Its Interface to the ‘Real’ World,” lecture

notes of ACS 6110 Embedded Systems, pp.31-32, 2013.

[21] ChaosPro, “RGB Color Model,” 2011. [Online]. Available:

http://www.chaospro.de/documentation/html/paletteeditor/colorspace_rgb.htm.

[Accessed: 19 August 2013].

[22] Wikipedia, 6 April 2007. [Online]. Available:

http://en.wikipedia.org/wiki/File:AdditiveColor.svg. [Accessed: 19 August 2013].

[23] Code Project, “An HSV/RGBA Colour Picker,” 4 April 2006. [Online].

Available: http://www.codeproject.com/Articles/9207/An-HSV-RGBA-colour-picker.

[Accessed: 19 August 2013].

[24] ChaosPro, “HSL Color Model,” 2011. [Online]. Available:

http://www.chaospro.de/documentation/html/paletteeditor/colorspace_hsl.htm.

[Accessed: 19 August 2013].

[25] Code Project, “The Known Colors Palette Tool – Revised,” 24 September 2011.

[Online]. Available:

http://www.codeproject.com/Articles/243610/The-Known-Colors-Palette-Tool-

Revised. [Accessed: 19 August 2013].

[26] T. Gevers, A.W.M. Smeulders, “Color-based Object Recognition,” Pattern

Recognition, vol. 32, no. 3, pp. 453-464, March 1999.

[27] A.B. Anand, “Tracking Red Color Objects Using Matlab,” 18 September 2010.

[Online]. Available: http://www.mathworks.co.uk/matlabcentral/fileexchange/28757-

tracking-red-color-objects-using-matlab. [Accessed: 19 August 2013].


[28] Arduino. [Online]. Available: http://arduino.cc/en/Main/ArduinoBoardUno.

[Accessed: 20 August 2013].

[29] Arduino. [Online]. Available: http://arduino.cc/en/Main/Software. [Accessed: 20

August 2013].

[30] T. Hirzel. [Online]. Available: http://arduino.cc/en/Tutorial/PWM. [Accessed: 20

August 2013].

[31] Cyberbotics, E-puck Brochure. [Online]. Available:

http://www.cyberbotics.com/e-puck/e-puck.pdf. [Accessed: 21 August 2013].

[32] Cyberbotics, 21 August 2013. [Online]. Available:

http://www.cyberbotics.com/e-puck/. [Accessed: 21 August 2013].

[33] E-puck, 24 November 2010. [Online]. Available: http://www.e-

puck.org/index.php?option=com_content&view=article&id=7&Itemid=9 [Accessed:

21 August 2013].

[34] MPLAB IDE Version 8.89.00.00, with C30 C compiler.

[35] E-puck, 24 November 2010. [Online]. Available: http://www.e-

puck.org/index.php?option=com_content&view=article&id=22&Itemid=13.

[Accessed: 21 August 2013].

[36] J. Millman, “Field-effect Transistors,” Electronic Devices and Circuits, USA:

McGraw-Hill, pp. 384-417, 1967.

[37] Datasheet Catalog, “Datasheet of STB36NF06L/STP36NF06L”. [Online].

Available:

http://pdf.datasheetcatalog.com/datasheet2/8/0iyt26iogeofcyaowtrx4z7iwiky.pdf.

[Accessed: 21 August 2013].


[38] Arduino, Fading. [Online]. Available: http://arduino.cc/en/Tutorial/Fading.

[Accessed: 22 August 2013].

[39] Tiny PIC Bootloader, version 1.9.8.


APPENDICES

A.1 Programme of the E-puck Robot

#include "p30f6014A.h"
#include "stdio.h"
#include "string.h"
#include "uart/e_uart_char.h"
#include "motor_led/e_init_port.h"
#include "motor_led/e_epuck_ports.h"
#include "motor_led/advance_one_timer/e_motors.h"
#include "motor_led/advance_one_timer/e_led.h"
#include "motor_led/advance_one_timer/e_agenda.h"
#include "motor_led/advance_one_timer/e_remote_control.h"
#include "a_d/advance_ad_scan/e_ad_conv.h"
#include "a_d/advance_ad_scan/e_prox.h"

#define DELAY      50000
#define TV_START   12
#define TV_ADDRESS 17476
#define TV_OK      53

void delay(long int time);
void run_animat();
void run_data();
int getselector();

void delay(long int time)
{
    long int i, j;
    for (i = 0; i < time; i++)
        for (j = 0; j < DELAY; j++);
}

int getselector()
{
    return SELECTOR0 + 2*SELECTOR1 + 4*SELECTOR2 + 8*SELECTOR3;
}

void wait_for_bluetooth(void)
{
    // Short delay to ensure that programming is possible
    long static start_delay;
    char static start_blink;
    for (start_blink = 0; start_blink < 8; start_blink++)
    {
        for (start_delay = 0; start_delay < 250000; start_delay++);
        e_set_led(8, 1);
    }
    e_set_led(8, 0);
}

int left_speed = 0;
int right_speed = 0;

int main()
{
    char buffer[80];
    int selector;
    int ir_check;
    int previous_check = 0;

    // system initialization
    e_init_port();
    e_init_uart1();
    e_init_motors();
    e_init_ad_scan(ALL_ADC);
    e_init_remote_control();
    wait_for_bluetooth();                 // Short delay to ensure programming

    // Reset if Power on (some problem for few robots)
    if (RCONbits.POR)
    {
        RCONbits.POR = 0;
        __asm__ volatile ("reset");
    }

    // Wait for command from TV remote controller to start
    //while(e_get_data()!=TV_START&&e_get_address()!=TV_ADDRESS);

    // Decide upon program
    selector = getselector();
    e_start_agendas_processing();
    sprintf(buffer, "Starting with selector pos %d\r\n", selector);
    e_send_uart1_char(buffer, strlen(buffer));
    switch (selector)
    {
        case 0: e_activate_agenda(run_animat, 2000); break;
        case 1: e_activate_agenda(run_data, 2000); break;
        default: break;
    }

    while (1)
    {
        // ir_check = e_get_check();
        // Wait for command from TV remote controller to end
        // if(ir_check != previous_check)
        //     previous_check = ir_check;
        // if(e_get_data()==TV_OK)
        //     break;
    }

    return 0;
}

void run_animat()
{
    static int sensor[8] = {0, 0, 0, 0, 0, 0, 0, 0};
    static int light_intensity = 0;
    static int intensity = 0;
    static int intensity0 = 0;
    static int speed = 0;
    static int speed0 = 0;
    int i;
    float k = 0.1;
    float r = 1.5;

    sensor[0] = 4000 - e_get_ambient_light(0);
    sensor[1] = 4000 - e_get_ambient_light(1);
    sensor[2] = 4000 - e_get_ambient_light(2);
    sensor[3] = 4000 - e_get_ambient_light(3);
    sensor[4] = 4000 - e_get_ambient_light(4);
    sensor[5] = 4000 - e_get_ambient_light(5);
    sensor[6] = 4000 - e_get_ambient_light(6);
    sensor[7] = 4000 - e_get_ambient_light(7);

    intensity = 0;
    for (i = 0; i < 8; i++)
        intensity += sensor[i];
    light_intensity = 0.5*intensity + 0.5*intensity0;
    intensity0 = intensity;

    speed = 300 + k*light_intensity;
    left_speed = 0.5*speed + 0.5*speed0;
    speed0 = speed;
    if (left_speed > 666)
        left_speed = 666;
    right_speed = r*left_speed;

    e_set_speed_left(left_speed);
    e_set_speed_right(right_speed);
}

void run_data()
{
    char buffer[80];
    static int sensor[8] = {0, 0, 0, 0, 0, 0, 0, 0};
    static int light_intensity = 0;
    static int intensity = 0;
    static int intensity0 = 0;
    static int speed = 0;
    static int speed0 = 0;
    int i;
    float k = 0.1;
    float r = 1.5;

    sensor[0] = 4000 - e_get_ambient_light(0);
    sensor[1] = 4000 - e_get_ambient_light(1);
    sensor[2] = 4000 - e_get_ambient_light(2);
    sensor[3] = 4000 - e_get_ambient_light(3);
    sensor[4] = 4000 - e_get_ambient_light(4);
    sensor[5] = 4000 - e_get_ambient_light(5);
    sensor[6] = 4000 - e_get_ambient_light(6);
    sensor[7] = 4000 - e_get_ambient_light(7);

    intensity = 0;
    for (i = 0; i < 8; i++)
        intensity += sensor[i];
    light_intensity = 0.5*intensity + 0.5*intensity0;
    intensity0 = intensity;

    speed = 300 + k*light_intensity;
    left_speed = 0.5*speed + 0.5*speed0;
    speed0 = speed;
    if (left_speed > 666)
        left_speed = 666;
    right_speed = r*left_speed;
    //e_set_speed_left(left_speed);
    //e_set_speed_right(right_speed);

    sprintf(buffer, "%d\r\n", light_intensity);
    e_send_uart1_char(buffer, strlen(buffer));
}

A.2 Programme for System Identification

%% Initialization of the Parameters
clear; clc;
x = -1;
y = -1;
light_intensity = 0;
speed = 0;
distance = 0;
secend_hand = -1;
fade_value = 10;
IP = [];
OP = [];
s = serial('COM5','BaudRate',9600);
fopen(s);
fwrite(s, light_intensity, 'uint8', 'sync')

%% Preparation of the Camera
a = imaqhwinfo;
[camera_name, camera_id, format] = getCameraInfo(a);
mt = videoinput(camera_name, camera_id, format);

% Set the properties of the video object
set(mt, 'FramesPerTrigger', Inf);
set(mt, 'ReturnedColorspace', 'rgb')
mt.FrameGrabInterval = 5;

% Start the video acquisition
start(mt)

%% Main Loop
for i = 1:26;
    %% Motion Tracking
    for j = 1:30;
        % Get the snapshot from the camera
        picture = getsnapshot(mt);

        % Track colour objects in real time
        % 1 for red; 2 for green; 3 for blue
        modified_pic = imsubtract(picture(:,:,3), rgb2gray(picture));
        % Use a median filter to filter out noise
        modified_pic = medfilt2(modified_pic, [3 3]);
        % Convert the resulting grayscale image into a binary image
        modified_pic = im2bw(modified_pic, 0.18);
        % Remove all small components less than 300px
        modified_pic = bwareaopen(modified_pic, 300);
        % Label all the connected components in the image
        obj = bwlabel(modified_pic, 8);
        % Image blob analysis
        box = regionprops(obj, 'BoundingBox', 'Centroid');

        % Display the image
        imshow(picture)
        hold on

        % Main loop for motion tracking
        for object = 1:length(box)
            bb = box(object).BoundingBox;
            bc = box(object).Centroid;
            % Local time
            c = clock;
            % cc = strcat(num2str(c(5)),'min',num2str(c(6)),'sec','----',num2str(bc));

            % Speed calculation
            if (x == -1 && y == -1)
                speed = 0;
            else
                distance = sqrt((bc(1)-x)^2 + (bc(2)-y)^2);
                if (c(6) > secend_hand)
                    delta_t = c(6) - secend_hand;
                else
                    delta_t = c(6) + 60 - secend_hand;
                end
                speed = distance/delta_t;
            end
            x = bc(1);
            y = bc(2);
            secend_hand = c(6);
            %disp(cc);
            %disp(distance);
            %disp(delta_t);
            disp(speed);

            % Rectangular box for display
            rectangle('Position',bb,'EdgeColor','r','LineWidth',2)
            plot(bc(1),bc(2), '-m+')
            a = text(bc(1)+15,bc(2), strcat('X: ', num2str(round(bc(1))), 'Y: ', num2str(round(bc(2)))));
            set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'yellow');
        end

        % Data acquisition
        IP = [IP; light_intensity];
        OP = [OP; speed];
        pause(0.5);
        hold off
    end

    %% Send PWM Value to the Controllable Light Source
    light_intensity = light_intensity + fade_value;
    if (light_intensity == 250)
        fade_value = -10;
    elseif (light_intensity == 0)
        fade_value = 10;
    end
    disp(light_intensity)
    fwrite(s, light_intensity, 'uint8', 'sync');
end

% Stop the video acquisition
stop(mt);
% Flush all the image data stored in the memory buffer
flushdata(mt);
% Save the data of input and output
save 'SI01.mat' IP OP;
% Clear all the data
fclose(s);
delete(s);
clear; clc;

%% System Identification
load SI01.mat;
figure;
boxplot(OP, IP);
degree = 2;
p = polyfit(IP, OP, degree);
y = polyval(p, IP);
figure;
plot(IP, y);

A.3 Programme for Analysis of the Result of System Identification

clear; clc;
load SI053;
newop = [];
newip = [];
for n = 1:26
    s_n = [];
    li_n = [];
    for m = 1:30
        s_n = [s_n; OP((n-1)*30+m)];
        li_n = [li_n; IP((n-1)*30+m)];
    end
    means = mean(s_n);
    newop = [newop; means];
    meanli = mean(li_n);
    newip = [newip; meanli];
end
plot(newip, newop)